Abstract

The architecture and parameters of convolutional neural networks have a significant impact on their performance. To overcome the limitations of most existing neural architecture search (NAS) methods, including fixed network architectures and high computational cost, this paper proposes a block-encoding-based neural architecture evolution method. A new block-encoding scheme is designed that divides the convolutional neural network architecture into blocks, each consisting of multiple functional layers. An efficient mutation operation is designed to speed up the evolutionary search and expand the evolution space of network architectures. Finally, the optimal evolved network is converted into an all-convolutional neural network with fewer parameters and a more concise architecture. Experiments on image datasets indicate that the proposed method greatly reduces network parameters and search time, achieves competitive classification accuracy, and directly yields the corresponding all-convolutional neural network architecture.
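To make the block-level encoding concrete, the following is a minimal sketch, not the authors' implementation: it assumes a genome represented as an ordered list of blocks, where each block groups several convolutional layers with shared hyperparameters, and mutation adds, removes, or alters a whole block. All class and field names (ConvBlock, Genome, mutate) are hypothetical illustrations of the idea described in the abstract.

```python
import random
from dataclasses import dataclass, field
from typing import List

@dataclass
class ConvBlock:
    """One block: a group of convolutional layers with shared settings (hypothetical)."""
    num_layers: int      # how many conv layers the block contains
    out_channels: int    # output channel count of each layer
    kernel_size: int     # spatial kernel size

@dataclass
class Genome:
    """A candidate architecture encoded as an ordered list of blocks (hypothetical)."""
    blocks: List[ConvBlock] = field(default_factory=list)

    def mutate(self) -> None:
        """Apply one randomly chosen block-level mutation."""
        op = random.choice(["add", "remove", "alter"])
        if op == "add" or not self.blocks:
            # Insert a new randomly configured block at a random position.
            self.blocks.insert(
                random.randint(0, len(self.blocks)),
                ConvBlock(
                    num_layers=random.randint(1, 3),
                    out_channels=random.choice([32, 64, 128, 256]),
                    kernel_size=random.choice([3, 5]),
                ),
            )
        elif op == "remove" and len(self.blocks) > 1:
            # Remove a block, keeping at least one.
            self.blocks.pop(random.randrange(len(self.blocks)))
        else:
            # Alter the hyperparameters of an existing block.
            blk = random.choice(self.blocks)
            blk.out_channels = random.choice([32, 64, 128, 256])
            blk.kernel_size = random.choice([3, 5])

# Example usage: start from a small genome and apply a few mutations.
genome = Genome(blocks=[ConvBlock(2, 64, 3), ConvBlock(2, 128, 3)])
for _ in range(3):
    genome.mutate()
print(genome)
```

Operating on whole blocks rather than individual layers is what keeps the mutation space coarse enough for a fast evolutionary search while still allowing the overall depth and width of the network to change.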
