Abstract

Hardware systems integrated with deep neural networks (DNNs) are expected to pave the way for future artificial intelligence (AI). However, manually designing efficient DNNs demands nontrivial computational resources, since extensive trial and error is required to finalize the network configuration. To this end, in this article we introduce a novel hardware-aware neural architecture search (NAS) framework, namely GoldenNAS, to automate the design of efficient DNNs. To begin with, we present a novel technique, called dynamic channel scaling, to enable channel-level search, since the number of channels has a non-negligible impact on both accuracy and efficiency. In addition, we introduce an efficient progressive space shrinking method that raises the awareness of the search space toward the target hardware while also alleviating the search overhead. Moreover, we propose an effective hardware performance modeling method to approximate the runtime latency of DNNs on the target hardware, which is integrated into GoldenNAS to avoid tedious on-device measurements. We then employ an evolutionary algorithm (EA) to search for the optimal operator/channel configurations of DNNs, denoted as GoldenNets. Finally, to enable the depthwise adaptiveness of GoldenNets under dynamic environments, we propose an adaptive batch normalization (ABN) technique, followed by a self-knowledge distillation (SKD) approach to improve the accuracy of the adaptive subnetworks. We conduct extensive experiments directly on ImageNet, which clearly demonstrate the advantages of GoldenNAS over existing state-of-the-art approaches.
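
Below is a minimal, illustrative sketch (not the authors' implementation) of the latency-constrained evolutionary search stage described above: candidate per-layer channel configurations are filtered by a latency predictor and evolved toward higher predicted accuracy. The channel choices, latency budget, and both predictors (`predict_latency`, `predict_accuracy`) are hypothetical placeholders standing in for the paper's supernet evaluator and hardware performance model.

```python
# Minimal sketch of latency-constrained evolutionary channel search.
# NOT the GoldenNAS code: the predictors, channel choices, and budget
# below are hypothetical stand-ins for the paper's supernet evaluator
# and hardware latency model.
import random

CHANNEL_CHOICES = [16, 24, 32, 48, 64]  # assumed per-layer width options
NUM_LAYERS = 8                          # assumed depth of the search space
LATENCY_BUDGET_MS = 5.0                 # assumed target-hardware budget


def predict_latency(widths):
    # Placeholder latency model: latency grows with total channel count.
    return 0.01 * sum(widths)


def predict_accuracy(widths):
    # Placeholder accuracy proxy: wider layers score higher, plus noise.
    return sum(widths) / (NUM_LAYERS * max(CHANNEL_CHOICES)) + random.gauss(0, 0.01)


def random_candidate():
    return [random.choice(CHANNEL_CHOICES) for _ in range(NUM_LAYERS)]


def mutate(widths, prob=0.2):
    # Re-sample each layer's width with a small probability.
    return [random.choice(CHANNEL_CHOICES) if random.random() < prob else w
            for w in widths]


def evolutionary_search(population_size=50, generations=20, parents=10):
    population = [random_candidate() for _ in range(population_size)]
    best = None
    for _ in range(generations):
        # Keep only candidates that satisfy the latency budget.
        feasible = [c for c in population if predict_latency(c) <= LATENCY_BUDGET_MS]
        feasible.sort(key=predict_accuracy, reverse=True)
        if feasible and (best is None
                         or predict_accuracy(feasible[0]) > predict_accuracy(best)):
            best = feasible[0]
        # Refill the population by mutating the best feasible candidates.
        top = feasible[:parents] or [random_candidate() for _ in range(parents)]
        population = top + [mutate(random.choice(top))
                            for _ in range(population_size - len(top))]
    return best or random_candidate()


if __name__ == "__main__":
    widths = evolutionary_search()
    print("channel widths:", widths,
          "| predicted latency (ms):", round(predict_latency(widths), 2))
```

In a real search, the accuracy proxy would be replaced by evaluating the sampled sub-network (with weights inherited from the trained supernet) and the latency function by the learned hardware performance model, so no on-device measurement is needed inside the loop.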
