Graph Neural Networks (GNNs) are a potent tool for machine learning on graphs. By iteratively propagating neural messages along the edges of the input graph, GNNs integrate node feature information with graph structure. Not all neighborhood information is valuable, however: some of it may be irrelevant to the downstream task. For controlled and relevant information aggregation, we propose OOF-GNN (Over-smoothing and Over-squashing Free Graph Neural Network), a flexible method whose main idea is to limit the propagation of irrelevant and excessive information during message passing. We present an extensive discussion of the theoretical and experimental justifications for this method. The technique applies to standard baseline models (e.g., GCN, GraphSAGE, and GAT), allowing them to gather information from the K-hop neighborhood without node information being over-smoothed or over-squashed. Comprehensive experiments on both academic benchmarks and real-world datasets demonstrate the efficacy of OOF-GNN in addressing under-reaching, over-smoothing, and over-squashing, and in improving the performance of GNN models on node classification tasks.
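To make the message-passing scheme the abstract refers to concrete, the sketch below shows one generic GCN-style aggregation round in NumPy: each node averages features over its neighborhood (including itself) and applies a linear transform with a ReLU. This is only an illustration of the standard mechanism that OOF-GNN builds on; the function name, the mean-aggregation choice, and the toy graph are assumptions, not the paper's actual method.

```python
import numpy as np

def message_passing_layer(X, A, W):
    """One round of GCN-style neighborhood aggregation.

    X: (n, d) node feature matrix
    A: (n, n) adjacency matrix
    W: (d, d') learnable weight matrix

    Illustrative only -- OOF-GNN's filtering of irrelevant
    messages is not reproduced here.
    """
    # Add self-loops so each node retains its own features.
    A_hat = A + np.eye(A.shape[0])
    # Row-normalize: each node takes the mean of incoming messages.
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    # Aggregate neighbor features, transform, apply ReLU.
    return np.maximum(D_inv @ A_hat @ X @ W, 0.0)

# Tiny example: a 3-node path graph (0-1-2) with 2-dim features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
W = np.eye(2)  # identity weights for readability
H = message_passing_layer(X, A, W)
print(H.shape)  # one layer mixes each node with its 1-hop neighbors
```

Stacking K such layers lets information travel K hops, which is exactly the regime where over-smoothing (node features converging to indistinguishable values) and over-squashing (too much information compressed into fixed-size vectors) arise.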