Abstract

Filter pruning has proven to be an effective strategy for model compression. However, convolutional filter pruning methods usually focus on evaluating filters' importance within a single layer, ignoring their collaborative relationship with the corresponding filters of the next layer. In this paper, we propose a novel consecutive-layer collaborative filter similarity (CLCS) measure that makes full use of the complete filter information and learns binary selection vectors to prune redundant filters automatically. With the learned selection vectors, the pruning ratio of each layer can be determined, and we can also compute the FLOPs of the candidate pruned network at the current stage. Under both the accuracy constraint and the FLOPs constraint, the selection vectors of each layer can be optimized to achieve a better trade-off between accuracy and efficiency. Extensive experiments on CIFAR-10 and ImageNet with multiple networks demonstrate the effectiveness of the proposed method. Specifically, we obtain 54.29% and 67.33% FLOPs reduction with 0.01% and 0.09% accuracy improvement for ResNet-56 and ResNet-110 on CIFAR-10, respectively. On ImageNet, we reduce FLOPs by nearly half relative to the ResNet-50 baseline with almost no loss of accuracy. Compared with state-of-the-art filter pruning methods, our approach also achieves superior results.
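To illustrate the idea of pairing each filter with the next layer's kernels that consume its output, here is a minimal NumPy sketch. It is a hypothetical reading of the abstract, not the paper's algorithm: cosine similarity is assumed as the similarity measure, and redundancy is scored by each filter's nearest neighbor; the function name `clcs_selection` and the `keep_ratio` parameter are illustrative inventions.

```python
import numpy as np

def clcs_selection(W1, W2, keep_ratio=0.5):
    """Toy sketch of a consecutive-layer collaborative filter similarity.

    W1: weights of layer l,   shape (out1, in1, k, k)
    W2: weights of layer l+1, shape (out2, out1, k, k)
    Returns a binary selection vector over the out1 filters of layer l
    (1 = keep, 0 = prune). Assumed scoring, not the paper's exact method.
    """
    out1 = W1.shape[0]
    # Joint descriptor of filter i: its own kernel concatenated with the
    # slice of the next layer's kernels that reads its output channel.
    feats = np.stack([
        np.concatenate([W1[i].ravel(), W2[:, i].ravel()])
        for i in range(out1)
    ])
    feats /= np.linalg.norm(feats, axis=1, keepdims=True) + 1e-12
    sim = feats @ feats.T                    # pairwise cosine similarity
    np.fill_diagonal(sim, 0.0)
    redundancy = sim.max(axis=1)             # similarity to nearest filter
    n_keep = max(1, int(round(keep_ratio * out1)))
    keep = np.argsort(redundancy)[:n_keep]   # keep least-redundant filters
    select = np.zeros(out1, dtype=int)
    select[keep] = 1
    return select

rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 3, 3, 3))    # layer l: 8 filters
W2 = rng.standard_normal((16, 8, 3, 3))   # layer l+1 consumes those 8 channels
v = clcs_selection(W1, W2, keep_ratio=0.5)
print(v.tolist(), int(v.sum()))
```

Once such a selection vector is fixed per layer, the per-layer pruning ratio follows directly from the fraction of zeros, which is what lets the candidate network's FLOPs be evaluated against the constraint.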
