Abstract

Graph Neural Networks (GNNs) have attracted considerable attention for handling graph-structured data. To characterize the message-passing mechanism of GNNs, recent studies have established a unified framework that models the graph convolution operation as a graph signal denoising problem. While improving interpretability, this framework often performs poorly on heterophilic graphs and also leads to shallow, fragile GNNs in practice. The key reason is that it encourages feature smoothness but ignores the high-frequency information in node features. To address this issue, we propose a general framework for GNNs that relaxes the smoothness regularization. In particular, it employs an information aggregation mechanism to learn the low- and high-frequency components adaptively from data, offering more flexible graph convolution operators than the smoothness-promoting framework. Theoretical analyses demonstrate that our framework can effectively capture both low- and high-frequency information in node features. Experiments on nine benchmark datasets show that our framework achieves state-of-the-art performance in most cases. Furthermore, it can be used to build deep models and to defend against adversarial attacks.
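The abstract does not give the paper's exact operator, but the denoising view it references can be sketched concretely. In the standard formulation, one gradient step on the objective min_F ||F − X||² + c·tr(FᵀLF) yields a low-pass (smoothing) update, which discards high-frequency signal; relaxing the smoothness penalty amounts to also retaining the complementary high-pass response and mixing the two. The minimal numpy sketch below uses a fixed scalar mixing weight `alpha` as a stand-in for the adaptive, learned mechanism (an assumption for illustration; the paper's actual aggregation is not specified here):

```python
import numpy as np

# Toy path graph on 4 nodes.
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt  # symmetric normalized Laplacian

# Denoising view: one gradient step on  min_F ||F - X||^2 + c * tr(F^T L F),
# started from F = X, gives F <- X - c * (L @ X): a low-pass filter.
# Relaxation: also keep the complementary high-pass response and mix the two.
def mixed_filter(X, alpha):
    """alpha in [0, 1]: 1 -> pure low-pass, 0 -> pure high-pass.
    In the actual framework the mixing is learned from data; a fixed
    scalar is used here purely for illustration (an assumption)."""
    low = X - 0.5 * (L @ X)   # spectral response 1 - lambda/2: keeps low freq.
    high = 0.5 * (L @ X)      # spectral response lambda/2: keeps high freq.
    return alpha * low + (1.0 - alpha) * high

X = np.array([[1.0], [-1.0], [1.0], [-1.0]])  # alternating: high-frequency signal

def rayleigh(F):
    """Laplacian smoothness of F: small = smooth, large = high-frequency."""
    return float(F.T @ L @ F) / float(F.T @ F)

F_smooth = mixed_filter(X, alpha=1.0)  # where smoothness-promoting GNNs stop
F_sharp = mixed_filter(X, alpha=0.0)   # high-frequency component preserved
```

On this alternating signal, the low-pass branch reduces the Laplacian smoothness measure while the high-pass branch preserves it, which is the flexibility the relaxed framework exploits on heterophilic graphs, where informative signal is often high-frequency.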
