Abstract

Channel pruning is an efficient technique for model compression, removing redundant parts of a convolutional neural network with only minor degradation in classification accuracy. Previous channel pruning criteria ignore the intrinsic relationships among neurons and their high correlation with the input samples. Inspired by the visual crowding phenomenon in neuroscience, this paper presents a novel channel pruning method via reverse neuron crowding, dubbed CPRNC, to address this issue. First, CPRNC introduces a neuron crowding degree measure (NCDM) module, which models the relationships among all artificial neurons by observing their crowding behavior. Each channel's importance is then evaluated by the crowding degree of its neurons. Because channel importance is also affected by the characteristics of the input samples, CPRNC adds a neuron crowding degree recalibration (NCDR) module, which emphasizes discriminative samples to recalibrate the channel priority list generated by NCDM, further sharpening the pruning criterion. Experimental results show that CPRNC competes with state-of-the-art pruning methods, including dynamic channel pruning and learning-based pruning. For example, pruning 56.7% of ResNet-50's FLOPs on the large-scale ImageNet1K dataset costs only 0.19% in accuracy. At low pruning rates, CPRNC achieves lossless compression; e.g., the pruned ResNet-56 on CIFAR-10 gains 0.13% accuracy over the baseline model at a 56.3% FLOPs reduction.
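
The abstract describes a two-stage criterion: NCDM ranks channels by a crowding score, and NCDR reweights that ranking toward discriminative samples. The sketch below illustrates this rank-then-recalibrate structure only; the function names (`crowding_degree`, `recalibrate`), the cosine-similarity crowding proxy, and the loss-based sample weighting are illustrative assumptions, not the paper's actual measures.

```python
import torch
import torch.nn.functional as F

def crowding_degree(weight: torch.Tensor) -> torch.Tensor:
    """Hypothetical NCDM-style score: treat each output channel's filter as a
    neuron and measure how 'crowded' it is by its similarity to all others.
    A more crowded channel is assumed to be more redundant."""
    # weight: (out_channels, in_channels, kH, kW) -> one row per channel
    flat = F.normalize(weight.flatten(start_dim=1), dim=1)
    sim = flat @ flat.t()          # pairwise cosine similarity between filters
    sim.fill_diagonal_(0.0)        # ignore self-similarity
    return sim.abs().sum(dim=1)    # higher score = more crowded

def recalibrate(channel_scores: torch.Tensor,
                per_sample_activity: torch.Tensor,
                sample_hardness: torch.Tensor) -> torch.Tensor:
    """Hypothetical NCDR-style step: protect channels that are strongly
    activated by discriminative (hard) samples."""
    # per_sample_activity: (batch, out_channels), e.g. mean |activation|
    # sample_hardness: (batch,), e.g. per-sample loss as a hardness proxy
    w = sample_hardness / sample_hardness.sum()
    emphasis = (w[:, None] * per_sample_activity).sum(dim=0)
    return channel_scores / (1.0 + emphasis)   # emphasized channels drop in prune priority

# Usage: rank the channels of one conv layer, then recalibrate the ranking.
conv = torch.nn.Conv2d(64, 128, kernel_size=3)
scores = crowding_degree(conv.weight.detach())

acts = torch.rand(8, 128)          # stand-in per-sample channel activity
hardness = torch.rand(8)           # stand-in per-sample loss values
scores = recalibrate(scores, acts, hardness)

n_prune = int(0.5 * scores.numel())
prune_idx = scores.topk(n_prune).indices   # most crowded channels to remove
```

Cosine similarity between flattened filters is only one plausible stand-in for the paper's crowding measure; the point of the sketch is the two-stage pipeline the abstract outlines, in which a global crowding ranking is corrected by sample-dependent evidence before channels are removed.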
