Abstract

Neural Architecture Search (NAS) methods have been successfully applied to image tasks with excellent results. However, NAS methods are often complex and tend to converge to local minima as soon as the generated architectures yield good results. This paper proposes GEA, a novel approach for guided NAS. GEA guides the evolution by exploring the search space: at the initialization stage, it generates and evaluates several architectures using a zero-proxy estimator, and only the highest-scoring architecture is trained and kept for the next generation. Subsequently, GEA continuously extracts knowledge about the search space without increasing complexity by generating several offspring from an existing architecture at each generation. Moreover, GEA forces exploitation of the most performant architectures through descendant generation while simultaneously driving exploration through parent mutation and by favouring younger architectures over older ones. Experimental results demonstrate the effectiveness of the proposed method, and extensive ablation studies evaluate the importance of the different parameters. Results show that GEA achieves competitive results on all datasets of the NAS-Bench-101, NAS-Bench-201 and TransNAS-Bench-101 benchmarks, as well as in the DARTS search space.
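As a rough illustration of the search procedure the abstract describes, the Python sketch below outlines one way such a guided-evolution loop could be structured: score several candidates with a zero-proxy estimator, train only the best, mutate the most performant parent, and age out older architectures. All helper functions (sample_architecture, mutate, zero_proxy_score, train_and_evaluate) and hyperparameter values are hypothetical stand-ins for illustration, not the paper's implementation.

import random

# Minimal sketch of a GEA-style guided-evolution loop. All helper names and
# hyperparameter values are hypothetical stand-ins, not the paper's code.

def sample_architecture():
    # Stand-in: an architecture encoded as a list of operation indices.
    return [random.randint(0, 4) for _ in range(6)]

def mutate(arch):
    # Stand-in parent mutation: change one operation at random.
    child = list(arch)
    child[random.randrange(len(child))] = random.randint(0, 4)
    return child

def zero_proxy_score(arch):
    # Stand-in for the zero-proxy estimator, which scores an architecture
    # at the initialization stage without any training.
    return sum(arch) + random.random()

def train_and_evaluate(arch):
    # Stand-in for full training; returns a validation accuracy.
    return sum(arch) / len(arch) + random.random()

POPULATION_SIZE = 20   # assumed value
NUM_CANDIDATES = 8     # architectures scored per generation (assumed)
NUM_GENERATIONS = 100  # assumed value

def guided_evolution():
    # Initialization: generate and score several architectures with the
    # zero proxy; train only the highest-scoring one.
    candidates = [sample_architecture() for _ in range(NUM_CANDIDATES)]
    best = max(candidates, key=zero_proxy_score)
    population = [(best, train_and_evaluate(best))]

    for _ in range(NUM_GENERATIONS):
        # Exploitation: the most performant architecture becomes the parent.
        parent, _ = max(population, key=lambda entry: entry[1])
        # Exploration: generate several offspring by parent mutation; only
        # the offspring with the best zero-proxy score is trained and kept.
        offspring = [mutate(parent) for _ in range(NUM_CANDIDATES)]
        child = max(offspring, key=zero_proxy_score)
        population.append((child, train_and_evaluate(child)))
        # Ageing: favour younger architectures by discarding the oldest.
        if len(population) > POPULATION_SIZE:
            population.pop(0)

    return max(population, key=lambda entry: entry[1])

if __name__ == "__main__":
    arch, accuracy = guided_evolution()
    print(f"best architecture: {arch}, accuracy: {accuracy:.3f}")

In this sketch, training only one architecture per generation keeps the per-generation cost constant, while the cheap zero-proxy scoring of many candidates provides the guidance the abstract refers to.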
