Abstract
Graph Neural Networks (GNNs) are a potent tool for machine learning on graphs. By iteratively propagating neural messages along the edges of the input graph, GNNs integrate node feature information with graph structure. Nonetheless, not all neighborhood information is valuable; some of it may be unnecessary for the downstream task. For controlled and relevant information aggregation, we propose a flexible method, OOF-GNN (Over-smoothing and Over-squashing Free Graph Neural Network). The main idea of this technique is to limit the impact of irrelevant and excessive information propagation during message passing. An extensive discussion of the theoretical and experimental justifications for this method is presented. The technique applies to standard baseline models (e.g., GCN, GraphSAGE, and GAT), allowing them to gather information from the K-hop neighborhood without node information becoming over-smoothed or over-squashed. The efficacy of OOF-GNN in addressing the challenges of under-reaching, over-smoothing, and over-squashing, as well as improving the performance of GNN models on node classification tasks, has been demonstrated through comprehensive experimentation on both academic benchmarks and real-world datasets.
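The abstract describes suppressing irrelevant messages during neighborhood aggregation. The sketch below illustrates the general idea of gated message passing in plain NumPy: each neighbor's message is scaled by a relevance score (cosine similarity here, with a cutoff threshold) before aggregation. This is an illustrative assumption for exposition, not the paper's actual OOF-GNN update rule; the function name, gating criterion, and threshold are all hypothetical.

```python
import numpy as np

def gated_message_passing(X, A, threshold=0.5):
    """One propagation step in which each neighbor's message is scaled
    by a relevance gate before aggregation, so low-relevance messages
    are filtered out rather than smoothed into the node state.

    Illustrative sketch only: the gate here is cosine similarity with a
    hard cutoff, which is NOT the mechanism proposed in the paper.

    X : (n, d) node feature matrix
    A : (n, n) binary adjacency matrix (no self-loops)
    """
    norms = np.linalg.norm(X, axis=1, keepdims=True) + 1e-8
    Xn = X / norms
    S = Xn @ Xn.T                               # pairwise cosine similarity
    G = np.where(S >= threshold, S, 0.0) * A    # gate, masked by adjacency
    deg = G.sum(axis=1, keepdims=True) + 1e-8   # gated degree for normalization
    return (G @ X) / deg                        # normalized, gated aggregation

# Toy graph: nodes 0 and 1 have similar features, node 2 is dissimilar.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
H = gated_message_passing(X, A)
```

Under this toy gate, node 0 aggregates only from its similar neighbor (node 1), while the dissimilar neighbor (node 2) is filtered out entirely, which is the kind of controlled aggregation the abstract motivates.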