Abstract

This paper investigates a comprehensive convolutional neural network (CNN) representation that encodes both layer connections and computational block attributes for neural architecture search (NAS). We formulate NAS as a bi-objective optimization problem in which two competing objectives, i.e., validation accuracy and model complexity, must be considered simultaneously. We employ the well-known multi-objective evolutionary algorithm (MOEA) nondominated sorting genetic algorithm II (NSGA-II) to perform multi-objective NAS experiments on the CIFAR-10 dataset. Our NAS runs obtain trade-off fronts of architectures with much wider ranges and better quality than NAS runs with less comprehensive representations. We also transfer promising architectures to other datasets, i.e., CIFAR-100, Street View House Numbers, and Intel Image Classification, to verify their applicability. Experimental results indicate that the architectures on the trade-off front obtained at the end of our NAS runs can be employed out of the box without any further modification.

Keywords: Multi-objective optimization; Deep learning; Neural architecture search; Evolutionary algorithms
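The bi-objective formulation mentioned in the abstract can be sketched as follows (a minimal rendering, not the paper's own notation; the symbols \(\mathcal{A}\), \(\mathrm{Acc}_{\mathrm{val}}\), and \(\mathrm{Complexity}\) are illustrative names, and the exact complexity measure is not specified in this abstract):

\[
\min_{a \in \mathcal{A}} \; \bigl( f_1(a),\, f_2(a) \bigr),
\qquad
f_1(a) = 1 - \mathrm{Acc}_{\mathrm{val}}(a),
\qquad
f_2(a) = \mathrm{Complexity}(a),
\]

where \(\mathcal{A}\) denotes the search space of CNN architectures induced by the representation. NSGA-II is then used to approximate the Pareto front of nondominated architectures trading off the two objectives.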
