Abstract

In recent years, neural architecture search (NAS) methods have been proposed to automatically generate task-oriented network architectures for image classification. However, the architectures obtained by existing NAS approaches are optimized only for classification performance and do not adapt to devices with limited computational resources. To address this challenge, we propose a neural architecture search algorithm that aims to simultaneously improve network performance and reduce network complexity. The proposed framework builds the network architecture automatically in two stages: block-level search and network-level search. In the block-level search stage, a gradient-based relaxation method is proposed that uses an enhanced gradient to design high-performance, low-complexity blocks. In the network-level search stage, an evolutionary multiobjective algorithm completes the automatic design from blocks to the target network. The experimental results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification, with an error rate of 3.18% on CIFAR10 (Canadian Institute for Advanced Research) and an error rate of 19.16% on CIFAR100, both with fewer than 1 M network parameters. Compared with other NAS methods, our method also yields a substantial reduction in the parameter count of the designed architectures.
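The abstract only names the block-level mechanism, not its details. As a concrete illustration, the following is a minimal PyTorch sketch of a gradient-based continuous relaxation in the DARTS style, which is the standard form such block-level search takes; the candidate operation set and the `MixedOp`/`alpha` names are illustrative assumptions, not the paper's actual "enhanced gradient" formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Continuous relaxation of a discrete operation choice (DARTS-style sketch).

    Hypothetical illustration only: the paper's abstract states that block-level
    search is gradient-based, but the exact candidate set and gradient
    enhancement are not specified there.
    """
    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),  # 5x5 conv
            nn.Identity(),                                            # skip connection
            nn.AvgPool2d(3, stride=1, padding=1),                     # 3x3 avg pool
        ])
        # One architecture parameter per candidate op, learned by gradient descent
        # jointly with the network weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax turns the discrete choice into a differentiable mixture;
        # after search, only the op with the largest weight is kept.
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage on a CIFAR-sized feature map: the output is differentiable with respect
# to both the operation weights and the architecture parameters alpha.
block = MixedOp(channels=16)
x = torch.randn(2, 16, 32, 32)
y = block(x)
```

The network-level stage described in the abstract (an evolutionary multiobjective algorithm assembling blocks into the target network, presumably trading off accuracy against parameter count on a Pareto front) is not sketched here, as the abstract gives no algorithmic detail for it.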
