Abstract

Using weight-sharing and continuous relaxation strategies, gradient-based differentiable architecture search has achieved great success in automatically designing neural network architectures. However, two unresolved issues still frustrate researchers and practitioners: the tendency of gradient descent to become trapped in local optima, and the performance collapse of searched architectures that contain too many unreasonable operations. To address these two issues, this paper proposes EST-NAS, a novel and efficient neural architecture search approach based on a hybrid evolutionary strategy. In particular, an evolutionary strategy is applied on top of gradient-based architecture search to explore additional search directions, with the aim of obtaining a better architecture. In the proposed EST-NAS, gradient-based architecture search is performed first, and the best architecture it finds is then used to construct an efficient initialization for the subsequent evolutionary-strategy search. By hybridizing the evolutionary strategy with gradient-based search, EST-NAS improves the performance of the searched architecture while retaining good search efficiency. Meanwhile, validation accuracy is used to measure the importance of operations directly, which reduces the error in relating an operation to task performance. Extensive experiments on various datasets and search spaces show that the proposed EST-NAS achieves remarkably competitive performance at lower search cost than other state-of-the-art NAS approaches.
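The two-phase pipeline the abstract describes can be summarized in a minimal sketch, shown below. This is illustrative only and is not the authors' implementation: the operation set, the toy `validation_accuracy` fitness (a placeholder for training a candidate and measuring its held-out accuracy, which EST-NAS uses directly), the REINFORCE-like surrogate for the gradient step, and all hyperparameters are assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

OPS = ["skip", "sep_conv_3x3", "sep_conv_5x5", "dil_conv_3x3", "max_pool", "avg_pool"]
NUM_EDGES = 8  # toy cell: one operation chosen per edge

def validation_accuracy(arch):
    """Placeholder fitness (assumption): stands in for training the candidate
    and measuring validation accuracy, the direct importance signal in EST-NAS."""
    target = np.array([1, 2, 1, 3, 2, 1, 0, 4])  # hypothetical 'good' cell
    return float(np.mean(np.array(arch) == target))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# --- Phase 1: gradient-based search over a continuous relaxation ---
alpha = rng.normal(size=(NUM_EDGES, len(OPS)))  # relaxed architecture weights
for _ in range(200):
    probs = softmax(alpha)
    # Sample a discrete architecture and nudge each edge toward operations
    # that score well (a stand-in for backprop through a weight-sharing supernet).
    sample = [rng.choice(len(OPS), p=p) for p in probs]
    reward = validation_accuracy(sample)
    for e, op in enumerate(sample):
        grad = -probs[e]
        grad[op] += 1.0
        alpha[e] += 0.5 * reward * grad  # REINFORCE-like surrogate update

seed_arch = list(np.argmax(alpha, axis=1))  # discretize the best architecture

# --- Phase 2: evolutionary strategy seeded by the gradient-search result ---
population = [seed_arch] + [
    [op if rng.random() > 0.2 else rng.integers(len(OPS)) for op in seed_arch]
    for _ in range(19)
]
for _ in range(30):
    ranked = sorted(population, key=validation_accuracy, reverse=True)
    parents = ranked[:5]
    population = parents + [
        [op if rng.random() > 0.1 else rng.integers(len(OPS)) for op in p]
        for p in parents for _ in range(3)
    ]

best = max(population, key=validation_accuracy)
print("best architecture:", [OPS[i] for i in best],
      "fitness:", validation_accuracy(best))
```

Seeding the evolutionary population with the gradient-search result, rather than with random architectures, is what lets the second phase escape the local optimum of phase one without paying the full cost of an evolutionary search from scratch.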
