Abstract
Graph neural networks (GNNs) stack layers that alternate graph convolutions and pointwise nonlinearities. Because graph convolutions combine information from neighboring nodes with coefficients shared across all nodes, their weights do not depend on the underlying graph. This decouples the number of parameters from the graph size; it does not, however, alleviate the computational cost of training GNNs on large graphs. In this paper, we address this cost by unveiling the ability of GNNs to transfer from small to large graphs. Relying on graphons as both graph limits and stochastic graph models, we define graphon neural networks (WNNs) and demonstrate that they can be used to sample GNNs. We derive a probabilistic bound on the difference between the outputs of a WNN and of a GNN sampled from it and, using a simple triangle inequality argument, compose two such bounds to obtain a transferability bound between GNNs sampled from the same graphon. When the convolutional filters are bandlimited in the spectral domain, this bound vanishes as the graph size grows.
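To make the sampling idea concrete, below is a minimal numerical sketch of the core mechanism: a graph filter with fixed, size-independent taps applied to graphs of growing size sampled from a graphon, compared against the same filter on a fine discretization of the graphon (a proxy for the WNN output). The specific graphon W(u, v) = exp(-|u - v|), the filter taps, the input signal, and the 1/n normalization of the adjacency matrix are illustrative assumptions, not the paper's construction or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def graphon(u, v):
    # Illustrative graphon (an assumption, not the paper's): W(u, v) = exp(-|u - v|)
    return np.exp(-np.abs(np.asarray(u)[:, None] - np.asarray(v)[None, :]))

def sample_graph(n):
    """Sample an n-node graph from the graphon: u_i ~ U[0, 1], A_ij ~ Bern(W(u_i, u_j))."""
    u = np.sort(rng.uniform(size=n))
    A = (rng.uniform(size=(n, n)) < graphon(u, u)).astype(float)
    A = np.triu(A, 1)
    A = A + A.T               # symmetric adjacency, no self-loops
    return u, A / n           # 1/n normalization so the spectrum matches the graphon operator

def graph_filter(S, x, h):
    """Graph convolution y = sum_k h[k] * S^k @ x; the taps h are shared across graphs."""
    y, z = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * z
        z = S @ z
    return y

h = np.array([0.5, 0.3, 0.2])              # filter taps, independent of graph size
signal = lambda u: np.cos(2 * np.pi * u)   # graphon signal evaluated at the nodes

# Proxy for the WNN output: the same filter on a fine discretization of the graphon.
m = 4000
grid = (np.arange(m) + 0.5) / m
y_wnn = graph_filter(graphon(grid, grid) / m, signal(grid), h)

for n in (50, 500, 5000):
    u, S = sample_graph(n)
    y_gnn = graph_filter(S, signal(u), h)
    err = np.abs(y_gnn - np.interp(u, grid, y_wnn)).max()
    print(f"n={n:5d}  max |GNN - WNN| = {err:.3f}")
```

The shrinking discrepancy as n grows mirrors the WNN-GNN bound described above; a transferability bound between two GNNs of different sizes then follows by the triangle inequality through their common WNN.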