Abstract

Graph Neural Networks (GNNs) have emerged as a crucial deep learning framework for graph-structured data. However, existing GNNs suffer from limited scalability, which hinders their practical deployment in industrial settings. Although many scalable GNNs have been proposed to address this limitation, they have been shown to act as low-pass graph filters, discarding valuable middle- and high-frequency information. This paper proposes Adaptive Filtering Graph Neural Networks (AFGNN), a novel graph neural network that can capture all frequency information on large-scale graphs. AFGNN consists of two stages. The first stage applies low-, middle-, and high-pass graph filters to extract comprehensive frequency information without introducing additional parameters; this computation is performed once before training, ensuring scalability. The second stage uses node-level attention-based feature combination to generate a customized graph filter for each node, in contrast to existing spectral GNNs, which apply a uniform graph filter to the entire graph. AFGNN supports mini-batch training, enhancing scalability while efficiently capturing all frequency information on large-scale graphs. We evaluate AFGNN by comparing its ability to capture all frequency information against spectral GNNs and its scalability against scalable GNNs. Experimental results show that AFGNN surpasses both scalable GNNs and spectral GNNs.
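
The abstract does not give the exact filter definitions, dimensions, or module names, so the following is only a minimal sketch of the two-stage idea it describes: Stage 1 pre-computes low-, middle-, and high-pass feature views without any learnable parameters, and Stage 2 learns a node-level attention over those views. The specific filter forms (Â^k X, Â(I − Â)X, (I − Â)X), class and function names, and layer sizes below are illustrative assumptions, not the paper's actual formulation.

```python
# Hedged sketch of a two-stage "precompute filters, then attend per node" design.
# Filter choices and module names are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

def precompute_filtered_features(adj_norm, x, hops=2):
    """Stage 1 (parameter-free, run once before training).

    adj_norm : sparse, symmetrically normalized adjacency (assumed input).
    Returns low-, middle-, and high-pass feature views of x.
    """
    low = x
    for _ in range(hops):                       # low-pass: \hat{A}^k X
        low = torch.sparse.mm(adj_norm, low)
    high = x - torch.sparse.mm(adj_norm, x)     # high-pass: (I - \hat{A}) X
    mid = torch.sparse.mm(adj_norm, high)       # middle-pass: \hat{A}(I - \hat{A}) X
    return [low, mid, high]

class NodeLevelAttentionCombiner(nn.Module):
    """Stage 2: per-node attention over the filtered views, so each node
    effectively receives its own combination of graph filters."""
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, hid_dim)
        self.att = nn.Linear(hid_dim, 1)
        self.classify = nn.Linear(hid_dim, out_dim)

    def forward(self, views):                   # views: list of [n_nodes, in_dim]
        h = torch.stack([self.proj(v) for v in views], dim=1)  # [n, V, hid]
        scores = F.softmax(self.att(torch.tanh(h)), dim=1)     # [n, V, 1]
        fused = (scores * h).sum(dim=1)                        # [n, hid]
        return self.classify(fused)
```

Because the filtered views are computed once up front, the second stage can be trained with standard mini-batches of nodes, which is what makes the overall design amenable to large graphs.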
