Abstract

Neural network pruning is a popular approach to reducing the computational complexity of deep neural networks. In recent years, growing evidence that conventional pruning methods rely on inappropriate proxy metrics, together with the increasing availability of new types of hardware, has drawn attention to hardware-aware network pruning, which incorporates hardware characteristics into the pruning loop. Both network accuracy and hardware efficiency (latency, memory consumption, etc.) are critical to the success of network pruning, but the conflict between these objectives means no single solution can be optimal for all of them. Previous studies mostly convert hardware-aware network pruning into single-objective optimization problems. In this paper, we propose to solve the hardware-aware network pruning problem with Multi-Objective Evolutionary Algorithms (MOEAs). Specifically, we formulate the problem as a multi-objective optimization problem and propose a novel memetic MOEA, named HAMP, which combines an efficient portfolio-based selection with a surrogate-assisted local search. Empirical studies demonstrate the potential of MOEAs to provide a set of alternative solutions simultaneously, as well as the superiority of HAMP over the state-of-the-art hardware-aware network pruning method.
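The conflict between accuracy and hardware efficiency described above is the reason no single solution is optimal: instead, one seeks a Pareto front of non-dominated trade-offs. The following is a minimal illustrative sketch of Pareto-dominance filtering over hypothetical pruned-network candidates scored by accuracy and latency; the candidate names and numbers are invented for illustration and are not from the paper, and this is not HAMP itself.

```python
# Illustrative sketch (not the paper's algorithm): filtering pruned-network
# candidates to the Pareto front of the (accuracy, latency) trade-off.
# All candidate data below is hypothetical.

def dominates(a, b):
    """a dominates b if a is no worse on both objectives
    (higher accuracy, lower latency) and strictly better on at least one."""
    return (a["acc"] >= b["acc"] and a["lat"] <= b["lat"]
            and (a["acc"] > b["acc"] or a["lat"] < b["lat"]))

def pareto_front(candidates):
    """Return the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Toy evaluations of pruned networks: accuracy (%) vs. latency (ms).
candidates = [
    {"name": "prune-30%",     "acc": 75.1, "lat": 12.0},
    {"name": "prune-50%",     "acc": 73.8, "lat": 8.5},
    {"name": "prune-70%",     "acc": 70.2, "lat": 6.1},
    {"name": "prune-50%-alt", "acc": 72.0, "lat": 9.0},  # dominated by prune-50%
]

front = pareto_front(candidates)
print(sorted(c["name"] for c in front))
# → ['prune-30%', 'prune-50%', 'prune-70%']
```

An MOEA such as HAMP returns such a front in one run, letting the practitioner pick the accuracy/latency trade-off their deployment target requires, instead of re-solving a single-objective problem per target.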
