Abstract

With the emergence of deep neural networks, many research fields, such as image classification, object detection, speech recognition, natural language processing, machine translation, and autonomous driving, have made major technological breakthroughs, and the resulting research achievements have been successfully applied in many real-life applications. Combining evolutionary computation with neural architecture search (NAS) is an important approach to improving the performance of deep neural networks. However, most related research focuses only on accuracy, so the searched neural architectures often perform poorly on other metrics such as time cost. In this paper, a multi-objective evolutionary algorithm with a probability stack (MOEA-PS) is proposed for NAS, which considers the two objectives of accuracy and time consumption. MOEA-PS uses an adjacency list to represent the internal structure of deep neural networks. In addition, a unique mechanism is introduced into the multi-objective genetic algorithm to guide the crossover and mutation processes when generating offspring. Furthermore, structure blocks are stacked using a proxy model to generate deep neural networks. Experimental results on Cifar-10 and Cifar-100 demonstrate that the proposed algorithm achieves an error rate similar to that of the most advanced NAS algorithms at a lower time cost. Finally, the network structure searched on Cifar-10 transfers directly to the ImageNet dataset, achieving 73.6% classification accuracy.
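To make the two core ingredients named above concrete, here is a minimal Python sketch of (a) an adjacency-list encoding of a candidate architecture and (b) Pareto dominance over the two objectives, error rate and time cost. The operation names, the `Candidate` structure, and the dominance convention are illustrative assumptions for exposition, not the actual definitions used in MOEA-PS.

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    # Adjacency list: node index -> list of (predecessor, operation) pairs.
    # Both node indices and operation names here are hypothetical.
    adjacency: dict = field(default_factory=dict)
    error_rate: float = float("inf")  # objective 1 (minimized)
    time_cost: float = float("inf")   # objective 2 (minimized)

def dominates(a: Candidate, b: Candidate) -> bool:
    """a Pareto-dominates b: no worse in both objectives, better in at least one."""
    no_worse = a.error_rate <= b.error_rate and a.time_cost <= b.time_cost
    better = a.error_rate < b.error_rate or a.time_cost < b.time_cost
    return no_worse and better

def pareto_front(population):
    """Keep only candidates not dominated by any other (the first non-dominated front)."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

if __name__ == "__main__":
    # A toy cell: node 2 consumes the outputs of nodes 0 and 1; node 3 consumes node 2.
    cell = {2: [(0, "conv3x3"), (1, "maxpool")], 3: [(2, "conv1x1")]}
    pop = [
        Candidate(cell, error_rate=0.04, time_cost=9.0),
        Candidate(cell, error_rate=0.05, time_cost=4.0),
        Candidate(cell, error_rate=0.06, time_cost=9.5),  # dominated by both others
    ]
    print(len(pareto_front(pop)))  # -> 2
```

In a full multi-objective search, a front computation like this would drive environmental selection, so that a fast-but-slightly-less-accurate network can survive alongside a slower, more accurate one instead of being discarded on accuracy alone.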
