Abstract

Neural architecture search (NAS) can automatically design architectures for deep neural networks (DNNs) and has become one of the hottest research topics in the current machine learning community. However, NAS is often computationally expensive because a large number of DNNs need to be trained to obtain their performance during the search process. Performance predictors can greatly alleviate the prohibitive cost of NAS by directly predicting the performance of DNNs. However, building a satisfactory performance predictor depends heavily on a sufficient number of trained DNN architectures, which are difficult to obtain due to the high computational cost. To address this critical issue, we propose an effective DNN architecture augmentation method named the graph isomorphism-based architecture augmentation method (GIAug) in this article. Specifically, we first propose a mechanism based on graph isomorphism, which can efficiently generate n! (i.e., a factorial of n) diverse annotated architectures from a single architecture with n nodes. In addition, we design a generic method to encode the architectures into a form suitable for most prediction models. As a result, GIAug can be flexibly utilized by various existing performance predictor-based NAS algorithms. We perform extensive experiments on the CIFAR-10 and ImageNet benchmark datasets over small-, medium-, and large-scale search spaces. The experiments show that GIAug can significantly enhance the performance of state-of-the-art peer predictors. In addition, GIAug can save up to three orders of magnitude of computation cost on ImageNet while achieving similar performance compared with state-of-the-art NAS algorithms.
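To make the core idea concrete, the sketch below illustrates how graph isomorphism can multiply annotated training samples: permuting the node order of a cell's directed acyclic graph produces an equivalent architecture that can inherit the same accuracy label. This is a minimal illustration only; the representation (adjacency matrix plus per-node operation list) and the function name are assumptions for exposition and are not the authors' actual implementation or encoding scheme.

```python
# Hypothetical sketch of the graph-isomorphism augmentation idea:
# a cell is assumed to be a DAG given by an adjacency matrix and a
# list of per-node operations (illustrative representation only).
import itertools
import numpy as np

def isomorphic_variants(adj, ops, max_variants=None):
    """Yield architectures isomorphic to (adj, ops) by permuting node order.

    adj : (n, n) 0/1 array, adj[i, j] = 1 if node i feeds node j.
    ops : list of n operation names, one per node.
    Each of the n! node permutations gives an equivalent architecture,
    so all variants can share the accuracy label of the original one.
    """
    n = len(ops)
    for count, perm in enumerate(itertools.permutations(range(n))):
        if max_variants is not None and count >= max_variants:
            return
        p = np.asarray(perm)
        # Reorder rows and columns together so edges follow the renamed nodes.
        new_adj = adj[np.ix_(p, p)]
        new_ops = [ops[i] for i in p]
        yield new_adj, new_ops

# Example: one evaluated 3-node cell expands into 3! = 6 annotated samples
# for training a performance predictor (accuracy value is illustrative).
adj = np.array([[0, 1, 1],
                [0, 0, 1],
                [0, 0, 0]])
ops = ["conv3x3", "maxpool", "conv1x1"]
augmented = [(a, o, 0.94) for a, o in isomorphic_variants(adj, ops)]
```

In practice, search spaces often fix certain nodes (e.g., input and output), so only the intermediate nodes would be permuted; the sketch omits this detail for brevity.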
