Abstract

Neural Architecture Search (NAS) is a powerful tool for automating the design of effective DNNs for image and video processing. Ranking architectures by accuracy has been advocated as the basis for designing efficient performance predictors for NAS. Previous contrastive methods address the ranking problem by comparing pairs of architectures and predicting their relative performance. However, they focus only on the ranking between the two architectures involved and neglect the overall quality distribution of the search space, which can cause generalization issues. In contrast, we propose to let the performance predictor concentrate on the global quality level of a specific architecture, and to learn tier embeddings of the whole search space automatically with learnable queries. The proposed method, dubbed Neural Architecture Ranker with Query-to-Tier technique (NARQ2T), explores the quality tiers of the search space globally and classifies each architecture into the tier it belongs to. The predictor thus gains knowledge of the performance distribution of the search space, which helps it generalize its ranking ability across datasets. Thanks to the encoder-decoder design, our method can also predict the latency of the searched model without degrading the performance prediction. Meanwhile, the global quality distribution facilitates the search phase: candidates are sampled directly according to the statistics of the quality tiers, so no search algorithm (e.g., Reinforcement Learning or an Evolutionary Algorithm) needs to be trained, which simplifies the NAS pipeline and saves computational overhead. NARQ2T achieves state-of-the-art performance on two widely used NAS benchmark datasets, and extensive experiments validate the efficacy of the designed method.
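To make the tier-classification idea concrete, the following is a minimal NumPy sketch, not the paper's implementation: the tier count, embedding size, encoder weights, and all function names are hypothetical stand-ins. It shows the core mechanism the abstract describes, scoring an encoded architecture against a set of learnable tier queries via dot-product attention and assigning it to the highest-scoring tier.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_TIERS = 5   # hypothetical number of quality tiers
EMBED_DIM = 16  # hypothetical embedding size

# One learnable query per quality tier; in NARQ2T these would be trained
# jointly with the encoder-decoder. Here they are random stand-ins.
tier_queries = rng.normal(size=(NUM_TIERS, EMBED_DIM))

# Stand-in encoder weights: project a raw architecture encoding to EMBED_DIM.
encoder_weights = rng.normal(size=(8, EMBED_DIM))

def encode_architecture(arch_vector):
    """Hypothetical encoder: map a raw 8-dim architecture encoding to EMBED_DIM."""
    return arch_vector @ encoder_weights

def tier_logits(arch_embedding):
    """Score the embedding against each tier query (scaled dot product)."""
    return tier_queries @ arch_embedding / np.sqrt(EMBED_DIM)

def predict_tier(arch_vector):
    """Classify an architecture into the tier with the highest score."""
    return int(np.argmax(tier_logits(encode_architecture(arch_vector))))

arch = rng.normal(size=8)  # toy 8-dim architecture encoding
tier = predict_tier(arch)
assert 0 <= tier < NUM_TIERS
```

Under this view, search-phase sampling amounts to drawing candidates preferentially from the statistics of the top tiers instead of training a separate search controller.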
