Abstract

Neural architecture search (NAS) aims to discover robust, high-performance neural network architectures without manual design. However, limited search spaces, weak reuse of prior experience, and low search efficiency constrain NAS performance. This study proposes an evolutionary, knowledge-reconstruction-assisted method for neural architecture search. First, a search-space construction method based on network blocks with a priori knowledge of network morphism is proposed; it reduces the computational burden and the time required for the search while increasing the diversity of the search space. Next, a hierarchical variable-length coding strategy is designed for the complete evolutionary algorithm: it divides the neural network into two layers for coding, satisfies the need to decode with neural network weights, and encodes neural network structures of different depths. Furthermore, the complete differential evolution algorithm is used as the search strategy, opening a new possibility for applying network-morphism-based search spaces to evolutionary algorithms. Finally, comparison experiments on CIFAR10 and CIFAR100 indicate that the networks obtained with this method achieve similar or better classification accuracy than other neural architecture search algorithms and manually designed networks, while effectively reducing computation time and resource requirements.
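As a rough illustration of the ideas the abstract names, the sketch below shows a hierarchical variable-length encoding (an outer list of blocks, each holding inner hyperparameters) and a differential-evolution-style mutation over genomes of different lengths. All names, hyperparameter ranges, and the prefix-alignment rule are illustrative assumptions, not the paper's actual implementation.

```python
import random

# Hypothetical two-level encoding: the outer level lists network blocks
# (so genome length encodes network depth); the inner level holds each
# block's hyperparameters. Ranges below are illustrative assumptions.
MIN_LAYERS, MAX_LAYERS = 1, 4
MIN_CH, MAX_CH = 16, 64

def random_genome(min_blocks=2, max_blocks=5):
    """Sample a genome whose length (network depth) varies."""
    n_blocks = random.randint(min_blocks, max_blocks)
    return [
        {"layers": random.randint(MIN_LAYERS, MAX_LAYERS),
         "channels": random.choice([16, 32, 64])}
        for _ in range(n_blocks)
    ]

def de_mutate(base, a, b, f=0.5):
    """DE/rand-style mutation, base + f * (a - b), applied per block
    hyperparameter over the common prefix of three variable-length
    genomes, then clipped back to the valid ranges."""
    length = min(len(base), len(a), len(b))
    child = []
    for i in range(length):
        layers = round(base[i]["layers"] + f * (a[i]["layers"] - b[i]["layers"]))
        channels = round(base[i]["channels"] + f * (a[i]["channels"] - b[i]["channels"]))
        child.append({
            "layers": max(MIN_LAYERS, min(MAX_LAYERS, layers)),
            "channels": max(MIN_CH, min(MAX_CH, channels)),
        })
    # Blocks beyond the common prefix are inherited from the base parent,
    # so the offspring keeps the base parent's depth.
    child.extend(base[length:])
    return child

if __name__ == "__main__":
    random.seed(0)
    pop = [random_genome() for _ in range(4)]
    child = de_mutate(pop[0], pop[1], pop[2])
    print(len(child), [blk["channels"] for blk in child])
```

In a full search loop, each decoded child would be trained (or warm-started via network-morphism weight inheritance) and selected against its base parent by validation accuracy.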
