Abstract

Recently, Graph Neural Networks (GNNs) have shown great promise in addressing various tasks involving non-Euclidean data. Encouraged by its success in discovering convolutional and recurrent neural networks, Neural Architecture Search (NAS) has been extended to reduce the complexity of designing appropriate task-specific GNNs. Unfortunately, existing graph NAS methods often suffer from unscalable depth, redundant computation, a constrained search space, and other limitations. In this paper, we present an evolutionary graph neural network architecture search strategy, involving inheritance, crossover, and mutation operators based on fine-grained atomic operations. Specifically, we design two novel crossover operators at different granularity levels, GNNCross and LayerCross. Experiments on three different graph learning tasks indicate that the neural architectures generated by our method achieve performance comparable to handcrafted and automated baseline GNN models.
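The evolutionary loop described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the operation set `ATOMIC_OPS`, the fitness placeholder, and the exact behavior of the `layer_cross` and `gnn_cross` functions are all assumptions; only the operator names (GNNCross, LayerCross) and the overall inheritance/crossover/mutation structure come from the abstract.

```python
import random

# Hypothetical fine-grained atomic operations; the paper's actual
# operation set is not specified in the abstract.
ATOMIC_OPS = ["gcn", "gat", "sage", "skip"]
NUM_LAYERS = 3   # one operation gene per layer
POP_SIZE = 8
GENERATIONS = 5

def random_arch():
    return [random.choice(ATOMIC_OPS) for _ in range(NUM_LAYERS)]

def layer_cross(a, b):
    """LayerCross (sketch): exchange the operation at one layer index."""
    i = random.randrange(NUM_LAYERS)
    child = list(a)
    child[i] = b[i]
    return child

def gnn_cross(a, b):
    """GNNCross (sketch): coarser one-point crossover over layer blocks."""
    cut = random.randrange(1, NUM_LAYERS)
    return a[:cut] + b[cut:]

def mutate(arch, rate=0.2):
    """Mutation: resample each atomic operation with a small probability."""
    return [random.choice(ATOMIC_OPS) if random.random() < rate else op
            for op in arch]

def fitness(arch):
    # Placeholder: a real search would train and evaluate the GNN
    # encoded by `arch` on the target task.
    return sum(op != "skip" for op in arch)

def evolve():
    pop = [random_arch() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP_SIZE // 2]   # inheritance: elites carry over
        children = []
        while len(parents) + len(children) < POP_SIZE:
            a, b = random.sample(parents, 2)
            cross = random.choice([layer_cross, gnn_cross])
            children.append(mutate(cross(a, b)))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

The two crossover granularities matter because LayerCross makes a local edit (one layer) while GNNCross recombines larger sub-architectures, giving the search both fine and coarse exploration steps.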
