Convolutional neural networks (CNNs) have recently achieved great success in artificial intelligence tasks such as speech recognition, image recognition, and natural language processing. Architecture plays a key role in a CNN's performance. Most previous CNN architectures were hand-crafted, which requires designers to have rich domain expertise, and the trial-and-error process consumes considerable time and computing resources. To address this problem, researchers proposed neural architecture search (NAS), which searches for CNN architectures automatically to satisfy different requirements. However, the blindness of the search strategy causes a 'loss of experience' in the early stage of the search process, which ultimately degrades the results of later stages. In this paper, we propose a self-adaptive mutation neural architecture search algorithm based on ResNet blocks and DenseNet blocks. The self-adaptive mutation strategy lets the algorithm adjust its mutation strategies during the evolution process to achieve better exploration. In addition, the whole search process is fully automatic, and users need no expert knowledge of CNN architecture design. The proposed algorithm is compared with 17 state-of-the-art algorithms, including manually designed CNNs and automatic search algorithms, on CIFAR10 and CIFAR100. The results indicate that the proposed algorithm outperforms the competitors in classification performance while consuming fewer computing resources.
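The abstract does not give the paper's exact update rule, but the general idea of self-adaptive mutation, reweighting mutation operators by their recent success so the search stops "forgetting" what worked early on, can be sketched as follows. All names (the block-level operators, the moving-average credit scheme, the probability floor) are illustrative assumptions, not the authors' method:

```python
import random

# Hypothetical mutation operators over an architecture encoded as a list of
# block-type strings (e.g. ["resnet", "densenet"]). Names are illustrative.
def add_resnet_block(arch):
    i = random.randrange(len(arch) + 1)
    return arch[:i] + ["resnet"] + arch[i:]

def add_densenet_block(arch):
    i = random.randrange(len(arch) + 1)
    return arch[:i] + ["densenet"] + arch[i:]

def remove_block(arch):
    if len(arch) <= 1:          # never empty the architecture
        return list(arch)
    i = random.randrange(len(arch))
    return arch[:i] + arch[i + 1:]

OPERATORS = [add_resnet_block, add_densenet_block, remove_block]

class SelfAdaptiveMutation:
    """Select mutation operators with probability proportional to their
    recent success, so operator choice adapts as evolution proceeds."""

    def __init__(self, operators, floor=0.1):
        self.operators = operators
        self.scores = [1.0] * len(operators)  # optimistic initial credit
        self.floor = floor                    # keeps every operator explorable

    def probabilities(self):
        total = sum(self.scores)
        raw = [s / total for s in self.scores]
        # Mix with a uniform floor so no operator's probability reaches zero.
        k = len(raw)
        return [(1 - self.floor) * p + self.floor / k for p in raw]

    def select(self):
        r, acc = random.random(), 0.0
        for op, p in zip(self.operators, self.probabilities()):
            acc += p
            if r <= acc:
                return op
        return self.operators[-1]

    def reward(self, op, improved):
        """Credit an operator when its offspring beat the parent fitness,
        using an exponential moving average of success."""
        i = self.operators.index(op)
        self.scores[i] = 0.9 * self.scores[i] + (1.0 if improved else 0.0)
```

Under this sketch, an evolutionary loop would call `select()` to pick an operator, evaluate the mutated architecture, and call `reward()` with whether the offspring improved on its parent; operators that keep producing improvements are sampled more often, while the probability floor preserves exploration.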