Abstract

Channel pruning has proven more efficient than fine-grained pruning because it simultaneously achieves a high compression rate and low computational complexity without requiring additional hardware or software support for convolutional neural networks (CNNs). However, most existing works rarely take the correlation between filters into account, which leads to low pruning rates for channels and operations and a large accuracy loss. To address this problem, we propose two novel channel pruning methods based on spectral clustering, CCP and SCRP, which efficiently capture the correlation between filters and compress a CNN without obvious accuracy loss. In CCP, we first apply spectral clustering to perform unsupervised cluster analysis on the input filters. The reserved channels are then reconstructed as the intra-cluster averages, which minimizes the discrepancy between the pruned model and the pre-trained model. To reduce the information loss caused by pruning, we run sensitivity experiments with fine-tuning to determine a reasonable pruning rate for each layer; sensitivity testing with fine-tuning reflects the true effect of pruning better than directly observing the loss that pruning causes. We further improve CCP into SCRP for a higher sparsity rate: adding a spectral clustering loss to the classification loss, which indeed helps to achieve higher accuracy and a higher pruning rate in FLOPs. For example, our pruned VGGNet-16 (SCRP) achieves 93.66% accuracy with an 84.6% reduction in FLOPs on CIFAR-10, and 73.34% accuracy with a 76.8% reduction in FLOPs on CIFAR-100. Our code is available at: https://github.com//tnn2018/Clustering-Channel-Pruning.
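
The abstract does not give implementation details, but the CCP step it describes (cluster a layer's filters, then keep one intra-cluster average per cluster) can be sketched as follows. The function name cluster_and_average_filters, the use of scikit-learn's SpectralClustering with an RBF affinity, and the NumPy weight layout are our own assumptions, not the paper's code.

import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_and_average_filters(weight, n_clusters):
    """Cluster one layer's filters and keep one averaged filter per cluster.

    weight: array of shape (out_channels, in_channels, k, k).
    n_clusters: number of channels to reserve after pruning.
    """
    out_channels = weight.shape[0]
    flat = weight.reshape(out_channels, -1)  # one vector per filter
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="rbf",
                                random_state=0).fit_predict(flat)
    # Reconstruct each reserved channel as the intra-cluster average,
    # which the abstract says minimizes the discrepancy between the
    # pruned model and the pre-trained model.
    merged = np.stack([flat[labels == c].mean(axis=0)
                       for c in range(n_clusters)])
    return merged.reshape((n_clusters,) + weight.shape[1:]), labels

# Example: prune a 64-filter 3x3 layer down to 32 reserved channels.
w = np.random.randn(64, 16, 3, 3).astype(np.float32)
pruned_w, labels = cluster_and_average_filters(w, n_clusters=32)
print(pruned_w.shape)  # (32, 16, 3, 3)

Here n_clusters plays the role of the per-layer channel budget, which in the paper would come from the sensitivity experiments with fine-tuning.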

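For SCRP, the abstract only states that a spectral clustering loss is added to the classification loss; it does not define its form. The sketch below uses a within-cluster compactness penalty purely as an illustrative placeholder, with lam a hypothetical trade-off weight.

import torch
import torch.nn.functional as F

def spectral_cluster_penalty(weight, labels, n_clusters):
    """Sum of squared distances from each filter to its cluster centroid.

    labels: torch tensor of cluster assignments, e.g. obtained from the
    clustering step sketched above (an assumed interface, not the paper's).
    """
    flat = weight.reshape(weight.size(0), -1)
    loss = flat.new_zeros(())
    for c in range(n_clusters):
        members = flat[labels == c]
        if members.size(0) > 0:
            loss = loss + ((members - members.mean(dim=0)) ** 2).sum()
    return loss / flat.size(0)

def total_loss(logits, targets, weight, labels, n_clusters, lam=1e-4):
    # Classification loss plus the clustering regularizer (placeholder form).
    return (F.cross_entropy(logits, targets)
            + lam * spectral_cluster_penalty(weight, labels, n_clusters))
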