Abstract

The emergence of neural architecture search (NAS) algorithms has removed the constraints of manual neural network design, so that developing a neural network no longer requires extensive professional knowledge or trial and error. However, the extremely high computational cost limits the development of NAS algorithms. In this article, we investigate how to reduce the computational cost and improve the efficiency and effectiveness of evolutionary NAS (ENAS). We present a fast ENAS framework for multiscale convolutional networks based on evolutionary knowledge transfer search (EKTS). The framework is novel in that it combines global and local optimization methods during the search and targets multiscale network architectures. Evolutionary computation serves as the global optimization algorithm, offering high robustness and wide applicability for searching neural architectures. To speed up the search, we combine knowledge transfer with fast local learning. In addition, we explore a multiscale gray-box structure that combines the Bandelet transform with convolution to improve the network's approximation, learning, and generalization abilities. Finally, we compare the searched architectures against more than 40 different neural architectures, and the results confirm the effectiveness of the proposed framework.
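
The abstract does not spell out the search loop, but the core idea, evolutionary (global) mutation of architectures combined with knowledge transfer and fast local learning, can be illustrated compactly. Below is a minimal sketch on a toy regression task, assuming weight inheritance as the transfer mechanism; all names (mutate, inherit, local_train) and hyperparameters are illustrative assumptions, not the authors' EKTS implementation.

    # Sketch: evolutionary architecture search with knowledge transfer via
    # weight inheritance (hypothetical illustration, not the paper's code).
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, (256, 1))
    y = np.sin(3 * X)                      # toy 1-D regression target

    def init_weights(widths):
        """Random weights for an MLP with the given hidden-layer widths."""
        dims = [1] + list(widths) + [1]
        return [rng.normal(0, 0.5, (a, b)) for a, b in zip(dims, dims[1:])]

    def forward(weights, x):
        h = x
        for W in weights[:-1]:
            h = np.tanh(h @ W)
        return h @ weights[-1]

    def loss(weights):
        return float(np.mean((forward(weights, X) - y) ** 2))

    def local_train(weights, steps=30, lr=0.1):
        """Fast local learning: a few plain gradient-descent steps."""
        for _ in range(steps):
            acts = [X]                      # forward pass, caching activations
            for W in weights[:-1]:
                acts.append(np.tanh(acts[-1] @ W))
            pred = acts[-1] @ weights[-1]
            d = 2.0 * (pred - y) / len(X)   # dL/dpred, backward pass follows
            grads = [None] * len(weights)
            grads[-1] = acts[-1].T @ d
            for i in range(len(weights) - 2, -1, -1):
                d = (d @ weights[i + 1].T) * (1.0 - acts[i + 1] ** 2)
                grads[i] = acts[i].T @ d
            for W, g in zip(weights, grads):
                W -= lr * g
        return weights

    def inherit(child_widths, parent_weights):
        """Knowledge transfer: copy parent weights into the child wherever
        layer shapes overlap, instead of training from scratch."""
        child = init_weights(child_widths)
        for Wc, Wp in zip(child, parent_weights):
            r, c = min(Wc.shape[0], Wp.shape[0]), min(Wc.shape[1], Wp.shape[1])
            Wc[:r, :c] = Wp[:r, :c]
        return child

    def mutate(widths):
        """Global search move: grow, shrink, or resize a random hidden layer."""
        widths = list(widths)
        op = rng.integers(3)
        if op == 0 and len(widths) < 4:
            widths.insert(rng.integers(len(widths) + 1), int(rng.integers(4, 17)))
        elif op == 1 and len(widths) > 1:
            widths.pop(rng.integers(len(widths)))
        else:
            i = rng.integers(len(widths))
            widths[i] = max(2, widths[i] + int(rng.integers(-4, 5)))
        return widths

    # Evolutionary loop: global mutation + inherited-weight local learning.
    pop = [([8], local_train(init_weights([8])))]   # kept sorted by fitness
    for gen in range(20):
        parent_widths, parent_weights = pop[0]
        child_widths = mutate(parent_widths)
        child_weights = local_train(inherit(child_widths, parent_weights))
        pop.append((child_widths, child_weights))
        pop = sorted(pop, key=lambda p: loss(p[1]))[:5]   # keep 5 fittest
    print("best widths:", pop[0][0], "loss:", round(loss(pop[0][1]), 4))

Because each child starts from its parent's weights, only a few local gradient steps are needed per candidate, which is the intuition behind combining knowledge transfer with fast local learning to cut ENAS search cost.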
