Abstract
Neural Architecture Search (NAS) is the name given to a set of methods designed to automatically configure the layout of neural networks. Their success on Convolutional Neural Networks inspired their use in optimizing other types of neural network architectures, including Graph Neural Networks (GNNs). GNNs have been extensively applied to several collections of real-world data, achieving state-of-the-art results in tasks such as circuit design, molecular structure generation, and anomaly detection. Many GNN models have been proposed recently, and choosing the best model for each problem has become a cumbersome and error-prone task. Aiming to alleviate this problem, recent works have proposed strategies for applying NAS to GNN models. However, different search methods converge relatively quickly to good architectures, which raises questions about the structure of the problem. In this work we use Fitness Landscape Analysis (FLA) measures to characterize the search space explored by NAS methods for GNNs. We sample almost 90k different architectures that cover most of the fitness range, and represent them using both a one-hot encoding and an embedding representation. Results of the fitness-distance correlation and dispersion metrics show that the fitness landscape is easy to explore and presents low neutrality.
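As a rough illustration of the two FLA measures named above, the sketch below computes a fitness-distance correlation and a dispersion metric over a sample of encoded architectures with known fitness values. This is not the authors' code; the array names, shapes, the use of Euclidean distance, and the 5% "top" fraction are all illustrative assumptions.

```python
# Minimal sketch of two common FLA measures (fitness-distance correlation and
# dispersion) over sampled, encoded architectures. Not the authors' implementation.
import numpy as np
from scipy.spatial.distance import pdist

def fitness_distance_correlation(encodings, fitness):
    """Pearson correlation between fitness and distance to the best sampled point.

    For a maximization problem (e.g. validation accuracy), a strongly negative
    value suggests the landscape guides search toward the optimum.
    """
    best = encodings[np.argmax(fitness)]              # best-found architecture
    dists = np.linalg.norm(encodings - best, axis=1)  # distance of each sample to it
    return np.corrcoef(fitness, dists)[0, 1]

def dispersion(encodings, fitness, top_fraction=0.05):
    """Average pairwise distance of the top solutions minus that of the full sample.

    Values near zero (or negative) indicate good solutions are clustered together.
    """
    k = max(2, int(top_fraction * len(fitness)))
    top = encodings[np.argsort(fitness)[-k:]]         # best-performing subset
    return pdist(top).mean() - pdist(encodings).mean()

# Usage on random stand-in data; real inputs would be the ~90k sampled GNN
# architectures (one-hot or embedding representations) and their fitness values.
rng = np.random.default_rng(0)
enc = rng.integers(0, 2, size=(1000, 32)).astype(float)  # one-hot style encodings
fit = rng.random(1000)                                    # e.g. validation accuracies
print(fitness_distance_correlation(enc, fit), dispersion(enc, fit))
```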