Abstract

This paper investigates a comprehensive convolutional neural network (CNN) representation that encodes both layer connections and computational block attributes for neural architecture search (NAS). We formulate NAS as a bi-objective optimization problem in which two competing objectives, validation accuracy and model complexity, must be considered simultaneously. We employ the well-known multi-objective evolutionary algorithm (MOEA) nondominated sorting genetic algorithm II (NSGA-II) to perform multi-objective NAS experiments on the CIFAR-10 dataset. Our NAS runs obtain trade-off fronts of architectures that span much wider ranges and are of better quality than those obtained by NAS runs with less comprehensive representations. We also transfer promising architectures to other datasets, i.e., CIFAR-100, Street View House Numbers, and Intel Image Classification, to verify their applicability. Experimental results indicate that the architectures on the trade-off front obtained at the end of our NAS runs can be employed out of the box, without any further modification.

Keywords

Multi-objective optimization · Deep learning · Neural architecture search · Evolutionary algorithms
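To make the bi-objective formulation concrete, the sketch below runs NSGA-II from the pymoo library on two toy surrogate objectives. This is only an illustration of the optimization setup, not the paper's actual pipeline: in the paper, the first objective would be the validation error of the decoded CNN on CIFAR-10 and the second a model-complexity measure, whereas here the encoding, the surrogate functions, and the population/generation settings are all placeholder assumptions.

```python
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize


def surrogate_error(x):
    # Hypothetical stand-in for (1 - validation accuracy) of the decoded CNN.
    return float(np.mean((x - 0.5) ** 2))


def surrogate_complexity(x):
    # Hypothetical stand-in for model complexity (e.g., parameter count).
    return float(np.sum(x))


class NASProblem(ElementwiseProblem):
    """Bi-objective NAS: minimize validation error and model complexity."""

    def __init__(self, n_var=32):
        # Each variable stands in for one gene of the architecture encoding;
        # a continuous [0, 1] relaxation is assumed for simplicity.
        super().__init__(n_var=n_var, n_obj=2, xl=0.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        # Both objectives are minimized; NSGA-II keeps the nondominated set.
        out["F"] = [surrogate_error(x), surrogate_complexity(x)]


res = minimize(NASProblem(), NSGA2(pop_size=40), ("n_gen", 50),
               seed=1, verbose=False)
print(res.F[:5])  # first few points on the approximated trade-off front
```

In the actual experiments, each evaluation would involve training and validating a decoded CNN, which makes the search far more expensive; the surrogates above merely exercise the NSGA-II optimization loop.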
