Abstract

Accelerating deep convolutional neural networks (CNNs) has recently received ever-increasing research focus. Among the various approaches proposed in the literature, filter pruning has been regarded as a promising solution, owing to its significant speedup and memory reduction for both the network model and the intermediate feature maps. Previous works adopt the smaller-norm-less-important criterion, pruning filters with smaller ℓp-norm values and retraining alternately. This tends to narrow the model capacity for the following reasons: (1) Violent pruning. Previous works adopt a violent strategy in which all selected filters are pruned simultaneously, leaving limited room to retain model accuracy. (2) Filter degradation. Previous works simply set the pruned filters to zero and retrain them alternately, which easily causes the filters to lose their learning ability. To solve these problems, we propose a novel filter pruning method, namely Incremental Filter Pruning via Random Walk (IFPRW). IFPRW addresses violent pruning through an incremental strategy and filter degradation by means of a random walk. When applied to two image classification benchmarks, the usefulness and strength of IFPRW are validated. Notably, on CIFAR-10, IFPRW reduces more than 46% of FLOPs on ResNet-110 with even a 0.28% relative accuracy improvement. Moreover, on ILSVRC-2012, IFPRW reduces more than 54% of FLOPs on ResNet-101 with only a 0.7% top-5 accuracy drop, proving that IFPRW outperforms state-of-the-art filter pruning methods.
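
To make the smaller-norm-less-important criterion mentioned above concrete, the following is a minimal sketch of how per-filter ℓp-norm scores are typically computed and the lowest-scoring filters selected as pruning candidates. The function name, layer shapes, and pruning ratio are illustrative assumptions, not the paper's settings, and the sketch does not include IFPRW's incremental schedule or random-walk reassignment.

```python
# Hypothetical sketch: score each filter of a Conv2d layer by its ℓp-norm
# and pick the lowest-scoring fraction as pruning candidates.
import torch
import torch.nn as nn

def lowest_norm_filters(conv: nn.Conv2d, ratio: float = 0.3, p: int = 1):
    """Return indices of the `ratio` fraction of filters with the smallest ℓp-norm."""
    weight = conv.weight.data                                    # (out_channels, in_channels, k, k)
    scores = weight.abs().pow(p).sum(dim=(1, 2, 3)).pow(1.0 / p)  # per-filter ℓp-norm
    num_prune = int(ratio * weight.size(0))
    return torch.argsort(scores)[:num_prune]                     # smallest norms first

# Example usage with an arbitrary layer size
conv = nn.Conv2d(64, 128, kernel_size=3)
candidates = lowest_norm_filters(conv, ratio=0.3, p=1)
print(candidates)  # indices a norm-based criterion would mark for pruning
```

Under this baseline criterion, prior methods would zero out all selected filters at once and retrain; the abstract's point is that IFPRW instead prunes incrementally and uses a random walk to avoid the resulting filter degradation.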
