Abstract

Architecture generation optimization has received considerable attention in neural architecture search (NAS) owing to its efficiency in generating architectures. By learning architecture representations through unsupervised learning and constructing a latent space, the prediction process of performance predictors is simplified, improving the efficiency of architecture search. However, finding top-performing architectures in large, complex NAS search spaces remains challenging. In this paper, we propose an approach that combines a ranker with a generative model to address this challenge by regularizing the latent space and identifying top-ranked architectures. We introduce a ranking error that gradually regulates the training of the generative model, making architecture representations in the latent space easier to identify. Additionally, we propose a surrogate-assisted evolutionary search method based on neural network Bayesian optimization to efficiently explore promising architectures in the latent space. We demonstrate the benefits of our approach in optimizing top-ranked architectures, and our method outperforms state-of-the-art techniques on various NAS benchmarks. The code is available at https://github.com/outofstyle/RAGS-NAS.
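To make the surrogate-assisted evolutionary search idea concrete, the sketch below shows a generic loop of this kind: candidate points in a latent space are mutated, ranked by a cheap surrogate model, and only the top-ranked offspring receive expensive true evaluations. This is a minimal illustration under stated assumptions, not the paper's implementation: `true_score` stands in for an expensive architecture evaluation, the latent dimensionality and a ridge-regression surrogate are hypothetical simplifications of the neural network Bayesian optimization surrogate described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # hypothetical latent-space dimensionality

def true_score(z):
    # Stand-in for an expensive architecture evaluation (hypothetical).
    return -np.sum((z - 0.5) ** 2, axis=-1)

def fit_surrogate(Z, y, lam=1e-3):
    # Ridge regression on quadratic features: a simple stand-in for the
    # neural-network surrogate used in the actual method.
    X = np.hstack([Z, Z ** 2, np.ones((len(Z), 1))])
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return lambda Zq: np.hstack([Zq, Zq ** 2, np.ones((len(Zq), 1))]) @ w

# Initial population of evaluated latent points.
Z = rng.uniform(0, 1, size=(16, DIM))
y = true_score(Z)
init_best = float(y.max())

for gen in range(10):
    surrogate = fit_surrogate(Z, y)
    # Mutate the current best points to propose offspring.
    parents = Z[np.argsort(y)[-4:]]
    offspring = np.repeat(parents, 8, axis=0) + rng.normal(0, 0.1, (32, DIM))
    # Rank offspring with the cheap surrogate; truly evaluate only the top few.
    top = offspring[np.argsort(surrogate(offspring))[-4:]]
    Z = np.vstack([Z, top])
    y = np.concatenate([y, true_score(top)])

best = float(y.max())
```

The key efficiency lever is that the surrogate filters 32 offspring down to 4 true evaluations per generation; in NAS, each true evaluation would correspond to training or benchmarking a network.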
