Abstract
As the performance of convolutional neural networks (CNNs) has increased, so have their storage and power requirements. Among the methods proposed in the literature, filter pruning is a crucial technique for constructing lightweight networks. However, current filter pruning methods still suffer from complicated processes and training inefficiency. This paper proposes an effective filter pruning method that uses the saliency of the feature map (SFM), i.e., its information entropy, as a theoretical guide to whether a filter is essential. The pruning principle used here is that a filter whose feature map shows weak saliency at an early stage will not contribute significantly to the final accuracy. Thus, one can efficiently prune non-salient feature maps with smaller information entropy, together with their corresponding filters. In addition, an over-parameterized convolution method is employed to improve the pruned model's accuracy without increasing the parameter count at inference time. Experimental results show that, without introducing any additional constraints, this method advances the state of the art in FLOPs and parameter reduction at similar accuracy. For example, on CIFAR-10, the pruned VGG-16 loses only 0.39% in Top-1 accuracy while reducing parameters by 83.3% and FLOPs by 66.7%. On ImageNet-100, the pruned ResNet-50 degrades by only 0.76% in Top-1 accuracy while reducing parameters by 61.19% and FLOPs by 62.98%.
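The core idea of entropy-based saliency can be illustrated with a minimal sketch. The snippet below (an assumption for illustration, not the paper's exact procedure; the function names `feature_map_entropy` and `rank_filters_by_saliency` and the histogram bin count are hypothetical) estimates each channel's Shannon entropy from a histogram of its activations and ranks channels from least to most salient, so the lowest-entropy feature maps are the pruning candidates:

```python
import numpy as np

def feature_map_entropy(fmap, bins=32):
    """Shannon entropy (bits) of one channel's activation distribution.

    fmap: 2-D array (H, W) of activations for a single feature map.
    """
    hist, _ = np.histogram(fmap, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                     # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

def rank_filters_by_saliency(activations):
    """Rank channels by feature-map entropy, least salient first.

    activations: array of shape (C, H, W) from one layer.
    Returns channel indices sorted by ascending entropy.
    """
    entropies = np.array([feature_map_entropy(activations[c])
                          for c in range(activations.shape[0])])
    return np.argsort(entropies)

# A near-constant map carries little information (low entropy) and would
# be pruned before a map with a broad activation distribution.
rng = np.random.default_rng(0)
flat = np.zeros((8, 8))              # non-salient feature map
varied = rng.normal(size=(8, 8))     # salient feature map
order = rank_filters_by_saliency(np.stack([varied, flat]))
print(order[0])  # 1 — the flat map is ranked least salient
```

In an actual pruning pipeline, the entropies would be averaged over a batch of inputs before discarding the lowest-ranked filters and fine-tuning the network.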