Abstract

Neural architecture search (NAS) is a rapidly growing area of automated machine learning that searches for neural network architectures automatically. However, current NAS methods impose extremely high hardware and time costs. In this work, we propose a predictor based on a radial basis function neural network (RBFNN) as the surrogate model in Bayesian optimization to predict the performance of candidate architectures. Existing work also overlooks how difficult it is, in real-world applications, to directly search for architectures that meet a given performance requirement, and NAS must be run independently multiple times when several similar tasks are involved. We therefore further propose a multi-task learning surrogate model composed of multiple RBFNNs. Beyond acting as a predictor, the model jointly learns knowledge shared across similar tasks, improving NAS performance by handling multiple tasks simultaneously. In addition, current NAS focuses on finding the highest-performing networks and ignores the fact that deployed architectures are constrained by device memory, so the scale of the architecture must also be considered. We use a multi-objective optimization algorithm to balance performance and scale simultaneously and build a multi-objective evolutionary search framework to find the Pareto-optimal front. Once the search is complete, decision-makers can choose an appropriate architecture for deployment according to their performance requirements and hardware conditions. Compared with existing NAS methods, the proposed MT-ENAS algorithm finds neural architectures with competitive performance and smaller scale in less time.
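
To make the surrogate idea concrete, the sketch below shows a minimal RBF-network predictor of the kind the abstract describes: Gaussian basis functions centred on previously evaluated architecture encodings, with linear output weights fitted by regularized least squares. The encoding scheme, class name, and all hyperparameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class RBFNSurrogate:
    """Minimal RBF-network surrogate (illustrative sketch): Gaussian basis
    functions around centres drawn from the training set, linear output
    weights fitted by ridge-regularised least squares."""

    def __init__(self, n_centers=20, gamma=1.0, reg=1e-6):
        self.n_centers = n_centers  # number of basis-function centres
        self.gamma = gamma          # width of the Gaussian basis functions
        self.reg = reg              # ridge term for the output weights

    def _design(self, X):
        # Design matrix: Gaussian response of each input to each centre
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        # Pick centres as a random subset of the evaluated encodings
        rng = np.random.default_rng(0)
        idx = rng.choice(len(X), size=min(self.n_centers, len(X)), replace=False)
        self.centers = X[idx]
        Phi = self._design(X)
        A = Phi.T @ Phi + self.reg * np.eye(Phi.shape[1])
        self.w = np.linalg.solve(A, Phi.T @ y)
        return self

    def predict(self, X):
        return self._design(X) @ self.w


# Hypothetical usage: X encodes architectures as fixed-length vectors,
# y holds validation accuracies from a small number of true trainings.
X = np.random.rand(50, 12)            # 50 sampled architecture encodings
y = np.random.rand(50)                # their (placeholder) accuracies
model = RBFNSurrogate(n_centers=15).fit(X, y)
candidates = np.random.rand(200, 12)  # unevaluated candidate architectures
scores = model.predict(candidates)    # cheap performance estimates
```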
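The multi-objective side of the search can be illustrated in the same spirit. The snippet below is a generic non-dominated filter over two assumed objectives, predicted error and parameter count, both to be minimized; it is a sketch of Pareto-front extraction in general, not the specific evolutionary framework proposed in the paper.

```python
import numpy as np

def pareto_front(objectives):
    """Indices of non-dominated rows when every objective is minimised."""
    n = len(objectives)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # j dominates i if it is no worse in all objectives and strictly
        # better in at least one
        dominates_i = np.all(objectives <= objectives[i], axis=1) & \
                      np.any(objectives < objectives[i], axis=1)
        if dominates_i.any():
            keep[i] = False
    return np.where(keep)[0]

# Hypothetical population: column 0 = predicted error (lower is better),
# column 1 = parameter count in millions (lower is better).
rng = np.random.default_rng(1)
population = np.column_stack([rng.uniform(0.05, 0.4, 200),
                              rng.uniform(0.5, 20.0, 200)])
front = pareto_front(population)
print(f"{len(front)} Pareto-optimal architectures out of {len(population)}")
```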
