Papers With Code

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning

Bailin Li, Bowen Wu, Jiang Su, Guangrun Wang, Liang Lin

2020-07-06 · ECCV 2020 · Network Pruning
Paper · PDF · Code (official)

Abstract

Identifying the computationally redundant parts of a trained Deep Neural Network (DNN) is the central question that pruning algorithms address. Many algorithms try to predict the performance of pruned sub-nets by introducing various evaluation methods, but these are either inaccurate or too complicated for general use. In this work, we present a pruning method called EagleEye, in which a simple yet efficient evaluation component based on adaptive batch normalization reveals a strong correlation between different pruned DNN structures and their final converged accuracy. This strong correlation allows us to quickly identify the pruned candidates with the highest potential accuracy without actually fine-tuning them. The module is also general enough to plug into and improve existing pruning algorithms. EagleEye achieves better pruning performance than all of the pruning algorithms studied in our experiments. Concretely, when pruning MobileNet V1 and ResNet-50, EagleEye outperforms all compared methods by up to 3.8%. Even in the more challenging setting of pruning the compact MobileNet V1 model, EagleEye achieves the highest accuracy of 70.9% with 50% of operations (FLOPs) pruned. All accuracy results are Top-1 ImageNet classification accuracy. Source code and models are available to the open-source community at https://github.com/anonymous47823493/EagleEye .
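The core idea of the adaptive-BN evaluation described above can be sketched as follows: for each pruned candidate, re-estimate the BatchNorm running statistics on a small calibration set (forward passes only, no weight updates), then measure accuracy with the recalibrated statistics. This is a minimal PyTorch sketch, not the authors' official code; the function name `adaptive_bn_eval` and the `calib_loader`/`val_loader`/`max_batches` parameters are illustrative assumptions.

```python
import torch


def adaptive_bn_eval(candidate, calib_loader, val_loader,
                     device="cpu", max_batches=50):
    """Score a pruned candidate without fine-tuning, by first
    re-estimating BatchNorm statistics on a few calibration batches
    (the adaptive-BN idea), then evaluating classification accuracy.
    NOTE: illustrative sketch, not the official EagleEye implementation."""
    candidate.to(device)
    # Reset the running mean/var of every BatchNorm layer so that the
    # statistics are recomputed for the pruned structure.
    for m in candidate.modules():
        if isinstance(m, torch.nn.modules.batchnorm._BatchNorm):
            m.reset_running_stats()
            m.momentum = None  # None -> cumulative moving average

    # Forward passes in train mode update BN statistics; no_grad and the
    # absence of an optimizer step mean no weights are changed.
    candidate.train()
    with torch.no_grad():
        for i, (x, _) in enumerate(calib_loader):
            if i >= max_batches:
                break
            candidate(x.to(device))

    # Evaluate with the freshly calibrated statistics.
    candidate.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in val_loader:
            pred = candidate(x.to(device)).argmax(dim=1)
            correct += (pred == y.to(device)).sum().item()
            total += y.numel()
    return correct / total
```

In the EagleEye setting, this score would be computed for each randomly generated pruned candidate, and only the top-scoring candidate is then fine-tuned, which is what makes the search fast.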

Results

Task             Dataset   Metric          Value  Model
Network Pruning  ImageNet  Top-1 Accuracy  77.1   ResNet50-3G FLOPs
Network Pruning  ImageNet  Top-1 Accuracy  76.4   ResNet50-2G FLOPs
Network Pruning  ImageNet  Top-1 Accuracy  74.2   ResNet50-1G FLOPs
Network Pruning  ImageNet  Top-1 Accuracy  70.7   MobileNetV1-50% FLOPs

Related Papers

Hyperpruning: Efficient Search through Pruned Variants of Recurrent Neural Networks Leveraging Lyapunov Spectrum (2025-06-09)
TSENOR: Highly-Efficient Algorithm for Finding Transposable N:M Sparse Masks (2025-05-29)
Hierarchical Safety Realignment: Lightweight Restoration of Safety in Pruned Large Vision-Language Models (2025-05-22)
Adaptive Pruning of Deep Neural Networks for Resource-Aware Embedded Intrusion Detection on the Edge (2025-05-20)
Bi-LSTM based Multi-Agent DRL with Computation-aware Pruning for Agent Twins Migration in Vehicular Embodied AI Networks (2025-05-09)
Guiding Evolutionary AutoEncoder Training with Activation-Based Pruning Operators (2025-05-08)
ReplaceMe: Network Simplification via Layer Pruning and Linear Transformations (2025-05-05)
Optimization over Trained (and Sparse) Neural Networks: A Surrogate within a Surrogate (2025-05-04)