Abstract

Graph Neural Networks (GNNs) have been successfully applied to a variety of graph analysis tasks. Recent studies have demonstrated that decoupling neighbor aggregation from feature transformation helps scale GNNs to large graphs. However, very large graphs, with billions of nodes and millions of features, are still beyond the capacity of most existing GNNs. Moreover, when we are interested in only a small number of nodes (called target nodes) in a large graph, it is inefficient to use existing GNNs to infer the labels of these few target nodes, because they must propagate and aggregate either node features or predicted labels over the whole graph, incurring high costs relative to the few target nodes. To address these challenges, we propose COSAL, a novel scalable and effective GNN framework. In COSAL, we replace the expensive aggregation with an efficient proximate node selection mechanism, which picks out the most important $K$ nodes for each target node according to the graph topology. We further propose a fine-grained neighbor importance quantification strategy to enhance the expressive power of COSAL. Empirical results demonstrate that COSAL achieves superior accuracy, training speed, and partial inference efficiency. Remarkably, in terms of node classification accuracy, COSAL outperforms the baselines by significant margins of 2.22%, 2.23%, and 3.95% on the large graph datasets Amazon2M, MAG-Scholar-C, and ogbn-papers100M, respectively. Code is available at https://github.com/joyce-x/COSAL.
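
The abstract describes the key mechanism only at a high level: for each target node, select the $K$ most important nodes according to the graph topology instead of aggregating over the whole graph. The snippet below is a minimal, hypothetical sketch of such a top-$K$ proximate node selection step. It assumes personalized PageRank as the topology-based importance score; the actual scoring used by COSAL, and all names in the snippet (`topk_proximate_nodes`, `alpha`, `iters`), are illustrative assumptions rather than the paper's implementation.

```python
# Hypothetical sketch of topology-based top-K node selection for a set of
# target nodes. Personalized PageRank (PPR) is assumed as the importance
# measure; COSAL's actual scoring may differ.
import numpy as np
import scipy.sparse as sp

def topk_proximate_nodes(adj: sp.csr_matrix, targets, k=16, alpha=0.15, iters=50):
    """For each target node, return the K nodes with the highest
    personalized-PageRank score (a stand-in for an importance measure)."""
    n = adj.shape[0]
    deg = np.asarray(adj.sum(axis=1)).ravel()
    deg[deg == 0] = 1.0                      # avoid division by zero for isolated nodes
    P = sp.diags(1.0 / deg) @ adj            # row-normalized transition matrix
    selected = {}
    for t in targets:
        e = np.zeros(n)
        e[t] = 1.0                           # restart vector concentrated on the target
        r = e.copy()
        for _ in range(iters):               # power iteration for PPR
            r = (1 - alpha) * (P.T @ r) + alpha * e
        selected[t] = np.argsort(-r)[:k]     # indices of the K highest-scoring nodes
    return selected

# Toy usage on a small random undirected graph
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = sp.random(100, 100, density=0.05, random_state=rng, format="csr")
    A = ((A + A.T) > 0).astype(float)
    print(topk_proximate_nodes(A, targets=[0, 1], k=5))
```

Computing per-target scores in this way touches only the neighborhoods relevant to the few target nodes, which is the efficiency argument the abstract makes for partial inference.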
