Abstract
Neural Architecture Search (NAS), which aims to automatically find the best-performing neural structure, has attracted much attention from both academia and industry. However, most existing works assume that each layer accepts a fixed number of inputs from previous layers, ignoring the flexibility of receiving inputs from an arbitrary number of previous layers. Allowing inputs from an arbitrary number of layers introduces far more possible combinations of connections among layers, which may also result in much more complex structural relations in architectures. Existing works fail to capture these structural correlations among different layers, limiting their ability to discover the optimal architecture. To overcome this weakness, we study the NAS problem in this paper by assuming an arbitrary number of inputs for each layer and capturing the structural correlations among different layers. Nevertheless, besides the complex structural correlations, considering an arbitrary number of inputs for each layer may also lead to a fully connected structure with up to O(n^2) connections for n layers, posing great challenges in efficiently handling such a quadratic number of connections among layers. To tackle this challenge, we propose a Graph Q Network for NAS (GQNAS), where the states and actions are redefined for searching architectures with inputs from an arbitrary number of layers. Concretely, we regard a neural architecture as a directed acyclic graph and use a graph neural network (GNN) as the Q-function approximator in a deep Q network (DQN) to capture the complex structural relations between different layers and obtain accurate Q-values. Our extensive experiments show that the proposed GQNAS model achieves better performance than several state-of-the-art approaches.
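The core idea described above, encoding a candidate architecture as a directed acyclic graph over layers and using a message-passing GNN to produce Q-values over the O(n^2) candidate connections, can be sketched as follows. This is a minimal illustrative toy, not the paper's actual model: the message-passing form, weight shapes, and the choice of "add connection (i, j)" as the action space are all assumptions made for the example.

```python
import numpy as np

def gnn_q_values(adj, feats, W_msg, W_upd, W_out, rounds=2):
    """Toy message-passing GNN that scores a DAG-encoded architecture.

    adj: (n, n) adjacency matrix of the layer DAG, where adj[i, j] = 1
         means layer j receives input from layer i (upper-triangular).
    feats: (n, d) initial feature vector per layer.
    Returns one Q-value per candidate connection (i, j), i < j -- a
    stand-in for the DQN action space over possible layer connections.
    """
    h = feats
    for _ in range(rounds):
        # Each layer aggregates messages from its predecessors in the DAG.
        msgs = adj.T @ (h @ W_msg)
        # Node states are updated from their own features plus the messages.
        h = np.tanh(h @ W_upd + msgs)
    n = adj.shape[0]
    q = {}
    for i in range(n):
        for j in range(i + 1, n):  # all O(n^2) candidate connections
            q[(i, j)] = float(np.concatenate([h[i], h[j]]) @ W_out)
    return q

rng = np.random.default_rng(0)
n, d = 4, 8
# Random upper-triangular adjacency = random DAG over n layers.
adj = np.triu(rng.integers(0, 2, (n, n)), k=1)
q = gnn_q_values(adj, rng.standard_normal((n, d)),
                 rng.standard_normal((d, d)), rng.standard_normal((d, d)),
                 rng.standard_normal(2 * d))
best = max(q, key=q.get)  # greedy action: connection with the highest Q-value
print(len(q), best)
```

In a full DQN loop, `best` would be the greedy action (subject to epsilon-greedy exploration), the chosen connection would be added to `adj` to form the next state, and the trained child network's validation accuracy would supply the reward used to update the GNN weights.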