Abstract

Graph Neural Networks (GNNs) have shown expressive modeling ability on graphs. An inherent assumption is that connected nodes exhibit strong homophily, so that message passing propagates consistent label information into the central node representation. However, in heterophilic graphs, connected nodes often carry inconsistent labels, which degrades typical message passing and exacerbates its over-smoothing issue. In this paper, to tame heterophilic graphs for GNNs, we present a novel label estimation-based message passing scheme that further relieves over-smoothing and makes the learned representations more discriminative on graphs with either homophily or heterophily. Specifically, we optimize the edge weights of the graph by invoking label estimation, yielding a more effective adjacency matrix. With this matrix, a novel message passing scheme is presented that aggregates, for each central node, information from neighbors with consistent labels. Meanwhile, a symmetric cross-entropy loss is employed to mitigate the negative effects of noisy labels. With these components in our new message passing, we establish a unified framework termed Label Estimation-based GNN (LE-GNN). Extensive experiments demonstrate that our model prevents over-smoothing and is effective for node classification on both homophilic and heterophilic graphs.
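
Below is a minimal sketch (in PyTorch, not the authors' released implementation) of the two mechanisms the abstract names: re-weighting edges by the agreement of soft label estimates so that messages flow mainly between nodes with consistent labels, and a symmetric cross-entropy loss that bounds the penalty from mislabeled nodes. The function names, the inner-product agreement measure, and the log-clipping constant are illustrative assumptions, not the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def label_consistent_edge_weights(soft_labels, edge_index):
        # soft_labels: [num_nodes, num_classes] label estimates (rows sum to 1)
        # edge_index:  [2, num_edges] source/target node indices
        src, dst = edge_index
        # Agreement = probability that both endpoints share the same label
        # under the current estimates; heterophilic edges get small weights.
        return (soft_labels[src] * soft_labels[dst]).sum(dim=-1)

    def weighted_mean_aggregate(x, edge_index, edge_weight):
        # One message-passing round that averages neighbor features with the
        # label-consistency weights (a stand-in for the optimized adjacency).
        src, dst = edge_index
        out = torch.zeros_like(x)
        out.index_add_(0, dst, edge_weight.unsqueeze(-1) * x[src])
        deg = torch.zeros(x.size(0), device=x.device)
        deg.index_add_(0, dst, edge_weight)
        return out / deg.clamp(min=1e-12).unsqueeze(-1)

    def symmetric_cross_entropy(logits, targets, alpha=1.0, beta=1.0, log_zero=-4.0):
        # Standard CE plus a reverse CE in which predictions and labels swap
        # roles; log(0) is clipped to `log_zero`, which bounds the loss
        # contribution of mislabeled nodes.
        ce = F.cross_entropy(logits, targets)
        pred = F.softmax(logits, dim=-1)
        one_hot = F.one_hot(targets, num_classes=logits.size(-1)).float()
        log_one_hot = torch.full_like(one_hot, log_zero)   # clipped log(0)
        log_one_hot[one_hot == 1] = 0.0                    # log(1) = 0
        rce = -(pred * log_one_hot).sum(dim=-1).mean()
        return alpha * ce + beta * rce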
