Abstract

Most neural architecture search (NAS) methods are time-consuming because a training process is required to evaluate each candidate architecture. Motivated by this observation, this paper presents an efficient NAS algorithm that combines a promising metaheuristic named search economics (SE) with a new training-free estimator for evaluating candidate architectures, aiming both to find a good architecture and to reduce computation time. The basic idea of the proposed algorithm is to use the expected value of each region of the search space to guide the search, so that it focuses on high-potential regions rather than on individual solutions with high objective values within particular regions. To evaluate the proposed algorithm, we compare it with state-of-the-art non-training-free and training-free NAS methods. Experimental results show that it finds architectures comparable to or better than those found by most of the non-training-free NAS algorithms compared in this study, while requiring only a tiny fraction of their computation time.
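The abstract only outlines the idea, so the following Python sketch illustrates one plausible reading of region-based, expected-value-guided search; it is not the paper's algorithm. All names here (estimate_score, expected_value, se_style_nas) are hypothetical placeholders: the search space is partitioned into regions, each sampled architecture is scored by a stand-in training-free estimator, and the per-iteration sampling budget is allocated in proportion to each region's expected value rather than concentrated on the single best solution found so far.

```python
import random

def estimate_score(architecture):
    # Stand-in for a training-free (zero-cost) estimator; the paper's
    # actual estimator is not specified in the abstract.
    return sum(architecture) / len(architecture) + random.gauss(0, 0.1)

def random_architecture_in(region, length=8):
    # Toy encoding: an architecture is a vector of operation values
    # drawn from the region's band of the search space.
    lo, hi = region
    return [random.uniform(lo, hi) for _ in range(length)]

def expected_value(history):
    # One plausible reading of a region's "expected value": the best
    # score seen so far plus recent improvement, so regions that are
    # still improving keep receiving search budget.
    best = max(history)
    recent_gain = history[-1] - history[0] if len(history) > 1 else 0.0
    return best + recent_gain

def se_style_nas(num_regions=4, iterations=20, samples_per_iter=8):
    regions = [(i / num_regions, (i + 1) / num_regions)
               for i in range(num_regions)]
    # Seed each region's history with one initial estimate.
    histories = [[estimate_score(random_architecture_in(r))] for r in regions]
    best_arch, best_score = None, float("-inf")

    for _ in range(iterations):
        # Clamp to keep the budget-allocation weights positive.
        evs = [max(expected_value(h), 1e-6) for h in histories]
        total = sum(evs)
        for region, history, ev in zip(regions, histories, evs):
            # Allocate samples in proportion to the region's expected
            # value, not greedily to the current best region.
            n = max(1, round(samples_per_iter * ev / total))
            for _ in range(n):
                arch = random_architecture_in(region)
                score = estimate_score(arch)
                history.append(score)
                if score > best_score:
                    best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = se_style_nas()
    print(f"best estimated score: {score:.3f}")
```

Because every evaluation is a cheap estimate rather than a training run, the loop's cost is dominated by sampling, which is what makes a training-free estimator attractive for NAS in the first place.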
