Abstract

Pruning can remove redundant parameters and structures from Deep Neural Networks (DNNs) to reduce inference time and memory overhead. As an important component of neural networks, the feature map (FM) has started to be adopted for network pruning. However, the majority of FM-based pruning methods do not fully exploit the effective knowledge in FMs for pruning. In addition, due to the variability of FMs, it is challenging to design a pruning criterion that is robust with a small number of images and that supports pruning all layers in parallel. In this paper, we propose Adaptive Knowledge Extraction for Channel Pruning (AKECP), which compresses networks quickly and efficiently. In AKECP, we first investigate the characteristics of FMs and extract effective knowledge with an adaptive scheme. Secondly, we formulate the effective knowledge of FMs to measure the importance of the corresponding network channels. Thirdly, thanks to this effective knowledge extraction, AKECP can efficiently prune all layers simultaneously with extremely few, or even a single, image. Experimental results show that our method compresses various networks on different datasets without introducing additional constraints, and it advances the state of the art. Notably, for ResNet-110 on CIFAR-10, AKECP reduces parameters by 59.9% and FLOPs by 59.8% with negligible accuracy loss. For ResNet-50 on ImageNet, AKECP reduces the memory footprint by 40.5% and FLOPs by 44.1% with only a 0.32% drop in Top-1 accuracy.
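To make the FM-based pipeline in the abstract concrete, below is a minimal, hypothetical sketch of channel scoring from feature maps on a single image. The paper's adaptive knowledge-extraction criterion is not specified here, so per-channel activation variance is used as a stand-in importance measure; the names `score_channels`, `select_channels`, and `keep_ratio` are illustrative, not from the paper.

```python
# Hypothetical sketch of FM-based channel scoring, NOT AKECP's actual
# criterion: per-channel feature-map variance stands in for the paper's
# adaptive "effective knowledge" measure.
import torch
import torch.nn as nn


def score_channels(layer: nn.Conv2d, x: torch.Tensor) -> torch.Tensor:
    """Score each output channel of `layer` from its feature map on input `x`.

    Stand-in criterion: channels whose feature maps show little variation
    are assumed to carry less effective knowledge and become pruning
    candidates.
    """
    with torch.no_grad():
        fm = layer(x)                                   # (N, C, H, W) feature map
        scores = fm.flatten(2).var(dim=2).mean(dim=0)   # variance over H*W, mean over batch -> (C,)
    return scores


def select_channels(scores: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    """Return indices of the highest-scoring channels to keep."""
    k = max(1, int(keep_ratio * scores.numel()))
    return torch.topk(scores, k).indices


if __name__ == "__main__":
    conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
    image = torch.randn(1, 3, 32, 32)   # a single image, mirroring the few-image setting
    scores = score_channels(conv, image)
    keep = select_channels(scores, keep_ratio=0.6)
    print(f"keeping {keep.numel()}/16 channels: {sorted(keep.tolist())}")
```

Because each layer is scored only from its own feature maps, this style of criterion can be evaluated for all layers in one forward pass, which is what allows the parallel, few-image pruning the abstract describes.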

