Abstract

Designing advanced neural architectures for specific tasks involves weeks or even months of intensive investigation by experts with rich domain knowledge. In recent years, neural architecture search (NAS) has attracted the interest of many researchers due to its ability to design efficient neural architectures automatically. Among different search strategies, evolutionary algorithms have achieved significant success as derivative-free optimization algorithms. However, the tremendous computational resource consumption of evolutionary neural architecture search dramatically restricts its application. In this paper, we explore how fitness approximation-based evolutionary algorithms can be applied to neural architecture search, and propose NAS-EA-FA to accelerate the search process. We further exploit data augmentation and the diversity of neural architectures to enhance the algorithm, and present NAS-EA-FA V2. Experiments show that NAS-EA-FA V2 is at least five times faster than other state-of-the-art neural architecture search algorithms, such as regularized evolution and iterative neural predictor, on NASBench-101, and that it is also the most effective and stable algorithm on NASBench-201. All the code used in this paper is available at https://github.com/fzjcdt/NAS-EA-FA.
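To make the core idea concrete, below is a minimal, self-contained Python sketch of fitness approximation-based evolutionary search. It is not the paper's implementation (see the linked repository for that): the bit-vector encoding, the toy objective in true_fitness, the random-forest surrogate, and all names and parameters (encode, mutate, evolve, pop_size, evals_per_gen) are illustrative assumptions. The point is the loop structure: rank many cheap candidates with a learned surrogate, spend the expensive evaluation budget only on the most promising few, and keep refitting the surrogate on the growing archive of truly evaluated architectures.

# Conceptual sketch of fitness-approximation-based evolutionary NAS.
# NOT the authors' implementation; encoding, objective, and parameters
# are toy stand-ins for real architecture training and evaluation.
import random
import numpy as np
from sklearn.ensemble import RandomForestRegressor  # surrogate fitness model

def encode(arch):
    # Illustrative: map an architecture (here, a bit vector) to features.
    return np.asarray(arch, dtype=float)

def true_fitness(arch):
    # Stand-in for an expensive evaluation (e.g., fully training the network).
    return float(sum(arch))  # toy objective: maximize the number of 1s

def mutate(arch, rate=0.1):
    # Flip each bit independently with probability `rate`.
    return [bit ^ (random.random() < rate) for bit in arch]

def evolve(arch_len=20, pop_size=20, generations=10, evals_per_gen=5):
    population = [[random.randint(0, 1) for _ in range(arch_len)]
                  for _ in range(pop_size)]
    archive_x, archive_y = [], []          # truly evaluated architectures
    surrogate = RandomForestRegressor(n_estimators=50)
    for gen in range(generations):
        # Generate many candidate offspring by mutation.
        candidates = [mutate(random.choice(population))
                      for _ in range(pop_size * 5)]
        if archive_x:
            # Rank candidates cheaply with the surrogate model...
            preds = surrogate.predict(np.stack([encode(c) for c in candidates]))
            order = np.argsort(preds)[::-1]
            candidates = [candidates[i] for i in order]
        # ...and spend the expensive evaluation budget only on the top few.
        for arch in candidates[:evals_per_gen]:
            archive_x.append(encode(arch))
            archive_y.append(true_fitness(arch))
        # Refit the surrogate on all truly evaluated architectures so far.
        surrogate.fit(np.stack(archive_x), archive_y)
        # Survivor selection over truly evaluated individuals only.
        ranked = sorted(zip(archive_y, archive_x),
                        key=lambda p: p[0], reverse=True)
        population = [list(x.astype(int)) for _, x in ranked[:pop_size]]
    return max(zip(archive_y, archive_x), key=lambda p: p[0])

if __name__ == "__main__":
    best_y, best_x = evolve()
    print("best fitness found:", best_y)

In this sketch the surrogate replaces most true evaluations, which is where the speedup over plain evolutionary search comes from; the paper's V2 variant additionally augments the surrogate's training data and encourages architectural diversity among the candidates that receive true evaluations.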
