Abstract

Network pruning aims to improve the efficiency of deep neural networks by eliminating redundant components from the model. However, existing pruning methods typically require a well-trained model and apply a single, fixed pruning criterion throughout the pruning cycles. To address these limitations, we propose a novel method called Evolutionary Filter Criteria (EvoFC), which automatically searches for the pruning ratio and criterion of each layer through a population-based heuristic search. We introduce a unique encoding space that represents the chosen pruning criterion and ratio for each layer, facilitating the discovery of optimal architecture configurations for candidate networks across iterations. Additionally, we devise a novel weight inheritance mechanism to mitigate the computational burden associated with the population-based nature of the method, resulting in a significant reduction in overall training time. We validate our method by applying it to randomly initialized networks and conducting empirical experiments on the CIFAR-10/100, ILSVRC-2012, and Places365 datasets. The results demonstrate that our method effectively reduces the number of FLOPs while striking a fine balance between accuracy and computational efficiency. This underscores the practical value of our method in optimizing performance while efficiently utilizing computational resources, particularly when pruning networks starting from random initialization.
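
To make the per-layer encoding concrete, the following is a minimal sketch (not the authors' implementation) of how a candidate in such a search could be represented and evolved: each prunable layer carries a (criterion, ratio) gene, and a simple mutation-plus-selection loop explores the encoding space. The candidate criterion names, ratio bounds, evolutionary operators, and the toy fitness function are all illustrative assumptions; EvoFC's actual operators and fitness evaluation (which would train or evaluate weight-inheriting subnetworks) may differ.

```python
# Sketch of a per-layer (criterion, ratio) encoding for evolutionary pruning search.
# All specific names and hyperparameters here are assumptions for illustration.
import random

CRITERIA = ["l1_norm", "l2_norm", "geometric_median"]  # assumed candidate criteria


def random_genome(num_layers, max_ratio=0.7):
    """One candidate: a (criterion, pruning ratio) pair per prunable layer."""
    return [(random.choice(CRITERIA), random.uniform(0.0, max_ratio))
            for _ in range(num_layers)]


def mutate(genome, p=0.2, max_ratio=0.7):
    """Independently perturb each layer's gene with probability p."""
    out = []
    for crit, ratio in genome:
        if random.random() < p:
            crit = random.choice(CRITERIA)
            ratio = min(max_ratio, max(0.0, ratio + random.gauss(0.0, 0.1)))
        out.append((crit, ratio))
    return out


def evolve(num_layers, fitness_fn, pop_size=20, generations=10):
    """Plain truncation-selection loop; EvoFC's actual search may differ."""
    population = [random_genome(num_layers) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness_fn, reverse=True)
        parents = scored[: pop_size // 2]
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=fitness_fn)


if __name__ == "__main__":
    # Toy fitness: prefer higher average pruning ratio, standing in for the real
    # accuracy/FLOPs trade-off measured on a weight-inheriting subnetwork.
    toy_fitness = lambda g: sum(r for _, r in g) / len(g)
    best = evolve(num_layers=8, fitness_fn=toy_fitness)
    print(best)
```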
