Abstract
Neural Architecture Search (NAS) is an emerging approach to designing lightweight networks, offering researchers a trade-off between accuracy and speed while freeing them from tedious repeated trials. However, the main shortcoming of NAS is its high and unstable memory consumption during the search, especially for large-scale tasks. In this study, the proposed Asymptotic Neural Architecture Search network (ANAS) performs a proxyless search for large-scale tasks with economical and stable memory consumption. Instead of searching on a proxy task as other NAS algorithms do, ANAS performs a large-scale proxyless search that learns the deep neural network architecture directly on the target task. ANAS reduces the peak memory consumption with an asymptotic method and keeps memory consumption stable through the linked adjustment of a series of key indexes; a pruning operation and efficient candidate operations further decrease the total memory consumption. ANAS achieves a good trade-off between accuracy and speed on the CIFAR-10, CIFAR-100, and ImageNet classification datasets. Beyond classification, it shows strong multi-task transfer learning ability on the CamVid and Cityscapes segmentation tasks. ANAS reaches a 22.8% test error with 5M parameters on ImageNet, and 72.9 mIoU (mean Intersection over Union) at 119.9 FPS (Frames Per Second) on Cityscapes.
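The abstract does not spell out how candidate operations are pruned during the search, so the following is only a minimal sketch of the general idea behind pruning a differentiable search space to cap memory, assuming a DARTS-style mixed edge; the class name MixedEdge, the prune() helper, and the specific candidate operations are hypothetical and not taken from ANAS.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedEdge(nn.Module):
    """One edge of a searchable cell: a weighted sum of candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),                                            # skip connection
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),  # 5x5 conv
            nn.AvgPool2d(3, stride=1, padding=1),                     # average pooling
        ])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

    def prune(self, keep=2):
        """Drop the weakest candidates so later search steps hold fewer
        activations in memory (the rough intent of asymptotic pruning)."""
        keep_idx = torch.topk(self.alpha, keep).indices.tolist()
        self.ops = nn.ModuleList([self.ops[i] for i in keep_idx])
        self.alpha = nn.Parameter(self.alpha.data[keep_idx].clone())

# Usage: run the full search space, then shrink it mid-search.
edge = MixedEdge(channels=16)
out = edge(torch.randn(1, 16, 32, 32))   # full search space, highest memory
edge.prune(keep=2)                        # keep only the strongest candidates
out = edge(torch.randn(1, 16, 32, 32))    # cheaper forward pass afterwards
```

Pruning weak candidates as the search progresses is one common way to keep the peak activation memory bounded, which is the kind of stable, economical consumption the abstract attributes to ANAS.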