Abstract

Graph neural networks (GNNs) stack layers that alternate graph convolutions with pointwise nonlinearities. Because graph convolutions are parametrized locally, their weights are independent of the underlying graph. This decouples the number of parameters from the graph size; it does not, however, alleviate the computational cost of training GNNs on large graphs. In this paper, we address this issue by establishing the ability of graph neural networks to transfer from small to large graphs. Relying on graphons as both graph limits and stochastic graph models, we define graphon neural networks (WNNs) and demonstrate that GNNs can be sampled from them. We derive a probabilistic bound on the difference between the outputs of a WNN and of a GNN sampled from it and, via a simple triangle-inequality argument, compose two such bounds to obtain a transferability bound between GNNs on different graphs sampled from the same graphon. When the convolutional filters are bandlimited in the spectral domain, this bound vanishes as the graph size grows.
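To make the graphon-sampling construction concrete, the sketch below simulates the transfer scenario: the same graph-independent filter coefficients are applied to graphs of increasing size drawn from one graphon. The exponential graphon W(u, v) = exp(-2|u - v|), the normalized shift operator S = A/n, the input signal, and the filter taps are all illustrative assumptions for this sketch, not the paper's exact setup.

```python
import numpy as np

# Illustrative graphon (assumption): a smooth, symmetric kernel W: [0,1]^2 -> [0,1].
def graphon(u, v):
    return np.exp(-2.0 * np.abs(u[:, None] - v[None, :]))

def sample_graph(n, rng):
    """Sample an n-node undirected graph from the graphon (stochastic model view)."""
    u = np.sort(rng.uniform(size=n))           # latent node positions in [0, 1]
    p = graphon(u, u)                          # edge probabilities W(u_i, u_j)
    upper = np.triu(rng.uniform(size=(n, n)) < p, k=1).astype(float)
    return upper + upper.T, u                  # symmetric adjacency, no self-loops

def graph_conv(adj, x, taps):
    """Order-K graph convolution y = sum_k taps[k] * S^k x, with S = A / n.

    Dividing by n keeps the spectrum bounded so that outputs on graphs of
    different sizes sampled from the same graphon are comparable (an
    illustrative normalization choice).
    """
    s = adj / adj.shape[0]
    y = np.zeros_like(x)
    z = x.copy()
    for h in taps:
        y += h * z
        z = s @ z
    return y

rng = np.random.default_rng(0)
taps1, taps2 = [1.0, 0.8, 0.3], [0.5, -0.2, 0.1]  # filter taps shared across all graph sizes

for n in (50, 200, 800):
    adj, u = sample_graph(n, rng)
    x = np.cos(np.pi * u)                      # graphon signal X(u) sampled at u_i
    y = graph_conv(adj, np.maximum(graph_conv(adj, x, taps1), 0.0), taps2)
    print(f"n={n:4d}  mean squared output per node: {np.mean(y**2):.4f}")
```

As n grows, per-node output statistics of the sampled GNNs should stabilize around those of the underlying WNN; this stabilization under fixed filter coefficients is the behavior the transferability bound quantifies.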
