Deep convolutional neural networks (DCNNs) have achieved remarkable success in the field of computer vision, and a number of elaborately designed networks have continually refreshed performance records on benchmark datasets. Recently, evolutionary neural architecture search (ENAS) has emerged as a promising area that employs evolutionary computation (EC) techniques to automatically construct promising network architectures without human intervention. However, existing algorithms still have limitations: most standard EC approaches cannot directly process flexible-length architecture representations, and most fitness evaluation processes suffer from prohibitive computational cost and unreliable prediction results. To overcome these shortcomings, we propose an efficient particle swarm optimization (PSO)-based neural architecture search algorithm, named EAEPSO, to search for appropriate dense blocks for image classification tasks. EAEPSO addresses the first limitation by designing an autoencoder that encodes variable-length network representations as fixed-length latent vectors, converting the original search space into a latent space that facilitates the downstream search. To address the second limitation, an efficient and effective hierarchical fitness evaluation method is designed to guide the search process. The experimental results show that EAEPSO is a very competitive ENAS algorithm: it achieves error rates of 2.74% on CIFAR-10 and 16.17% on CIFAR-100, and reduces the computational cost from hundreds or thousands of GPU-days to only 2.2 and 4 GPU-days, respectively. Further analyses investigate the effect of reduced training data and confirm the effectiveness of both the proposed autoencoder and the proposed hierarchical fitness evaluation method.
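The core idea of searching in a fixed-length latent space can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: it substitutes a random linear projection and its pseudo-inverse for the trained autoencoder, uses a placeholder fitness function in place of the hierarchical evaluation of decoded networks, and all sizes (`MAX_LEN`, `LATENT_DIM`, swarm hyperparameters) are hypothetical. It only shows how variable-length architecture encodings become fixed-length vectors that a standard PSO update can process.

```python
import numpy as np

rng = np.random.default_rng(0)
MAX_LEN, LATENT_DIM = 8, 4  # hypothetical sizes, not from the paper

def pad(arch):
    """Zero-pad a variable-length architecture encoding to MAX_LEN."""
    v = np.zeros(MAX_LEN)
    v[:len(arch)] = arch
    return v

# Toy linear "autoencoder": a random projection and its pseudo-inverse
# stand in for the trained encoder/decoder pair described in the abstract.
W_enc = rng.normal(size=(MAX_LEN, LATENT_DIM))
W_dec = np.linalg.pinv(W_enc)

def encode(arch):
    return pad(arch) @ W_enc   # fixed-length latent vector

def decode(z):
    return z @ W_dec           # back to (padded) architecture space

def fitness(z):
    """Placeholder: in practice one would decode z into a network and
    evaluate it (hierarchically, per the paper). Here we just maximize
    a toy objective so the sketch is runnable."""
    return -np.sum(decode(z) ** 2)

def pso(num_particles=10, iters=30, w=0.7, c1=1.5, c2=1.5):
    """Standard PSO update, operating entirely in the latent space."""
    X = rng.normal(size=(num_particles, LATENT_DIM))
    V = np.zeros_like(X)
    pbest = X.copy()
    pbest_f = np.array([fitness(x) for x in X])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = X + V
        f = np.array([fitness(x) for x in X])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = X[improved], f[improved]
        gbest = pbest[pbest_f.argmax()].copy()
    return gbest

best_z = pso()  # best latent vector found; decode(best_z) yields an architecture encoding
```

Because every particle is a fixed-length latent vector, the velocity and position updates above apply unchanged regardless of how long the original architecture encodings were, which is the point of converting the search space.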