Abstract

The global structure of the hyperparameter spaces of neural networks is not well understood, so it is unclear which hyperparameter search algorithm will be most effective. In this paper we analyze the landscapes of convolutional neural network architecture search spaces to provide insight into appropriate search algorithms for these spaces. Using a classical fitness landscape analysis approach (fitness distance correlation) and a more recent tool (local optima networks), we study the global structure of these spaces. Our analysis on six image classification datasets reveals that the landscapes are multi-modal, but with relatively few local optima, from which a simple perturbation operator can readily escape. This finding led us to explore the performance of iterated local search, which we found to search the training landscapes more effectively than three evolutionary algorithm variants. Evolutionary algorithms, however, outperformed iterated local search in terms of generalization on problems with larger discrepancies between the training and testing landscapes.
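The iterated local search strategy mentioned above (hill-climb to a local optimum, perturb, repeat) can be sketched generically. This is a minimal illustration, not the paper's implementation: the integer search space, fitness function, neighborhood, and perturbation operator below are all hypothetical stand-ins for an architecture encoding and its evaluation.

```python
import random


def iterated_local_search(init, fitness, neighbors, perturb, iters=100):
    """Generic iterated local search (ILS) sketch.

    init: starting configuration
    fitness: configuration -> score (higher is better)
    neighbors: configuration -> iterable of neighboring configurations
    perturb: simple perturbation operator used to escape local optima
    """

    def local_search(x):
        # Hill-climb: move to any strictly improving neighbor until none exists.
        improved = True
        while improved:
            improved = False
            for n in neighbors(x):
                if fitness(n) > fitness(x):
                    x, improved = n, True
                    break
        return x

    best = local_search(init)
    current = best
    for _ in range(iters):
        candidate = local_search(perturb(current))
        if fitness(candidate) >= fitness(current):  # accept equal-or-better
            current = candidate
        if fitness(current) > fitness(best):
            best = current
    return best


# Toy multimodal landscape over integers 0..63 (illustrative only):
# peaks at x = 8, 24, 40, 56 with increasing heights, so a plain
# hill-climber from 0 stalls at x = 8 and needs perturbation to progress.
f = lambda x: -(x % 16 - 8) ** 2 + x
nbrs = lambda x: [y for y in (x - 1, x + 1) if 0 <= y <= 63]
pert = lambda x: max(0, min(63, x + random.randint(-16, 16)))

random.seed(0)
print(iterated_local_search(0, f, nbrs, pert))
```

The acceptance rule (equal-or-better) and the perturbation strength are the usual ILS design knobs; a perturbation just large enough to jump between basins is what exploits the "few, easy-to-escape local optima" structure the abstract describes.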
