Abstract

Message Passing Neural Networks (MPNNs) are a promising architecture for machine learning on graphs that iteratively propagates information among nodes. Existing MPNN methods are better suited to homophilic graphs, in which geometrically close nodes have similar features and class labels. However, many real-world graphs exhibit heterophily, and the performance of MPNNs may be limited on such graphs. We analyze the limitations of MPNNs on heterophilic graphs and attribute them to the indistinguishability of nodes during aggregation and combination. To address this, we propose a method under the MPNN architecture, the Position Enhanced Message Passing model (PEMP), which endows each node with position information to make it distinguishable. Extensive experiments on nine real-world datasets show that our method achieves state-of-the-art performance on most heterophilic graphs while preserving the performance of MPNNs on homophilic graphs.
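The general idea can be illustrated with a minimal PyTorch sketch, not the authors' PEMP implementation: each node's hidden state is concatenated with a position embedding before messages are computed and aggregated, so nodes whose neighborhoods carry similar features no longer collapse to the same representation. The per-node learnable embedding, the mean aggregator, and all class and parameter names below are illustrative assumptions.

    import torch
    import torch.nn as nn

    class PositionEnhancedMPNNLayer(nn.Module):
        """One message-passing layer in which each node's features are
        concatenated with a position embedding before aggregation, keeping
        nodes distinguishable (illustrative sketch, not the paper's code)."""

        def __init__(self, num_nodes, in_dim, pos_dim, out_dim):
            super().__init__()
            # Assumption: one free learnable position vector per node.
            self.pos = nn.Embedding(num_nodes, pos_dim)
            self.message = nn.Linear(in_dim + pos_dim, out_dim)
            self.update = nn.Linear(in_dim + pos_dim + out_dim, out_dim)

        def forward(self, x, adj):
            # x: [N, in_dim] node features; adj: [N, N] dense 0/1 adjacency.
            ids = torch.arange(x.size(0), device=x.device)
            h = torch.cat([x, self.pos(ids)], dim=-1)        # position-enhanced state
            msgs = self.message(h)                           # per-node messages
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1)  # avoid division by zero
            agg = (adj @ msgs) / deg                         # mean aggregation over neighbors
            return torch.relu(self.update(torch.cat([h, agg], dim=-1)))

    # Usage on a 5-node toy graph with 8-dimensional features.
    x = torch.randn(5, 8)
    adj = (torch.rand(5, 5) > 0.5).float()
    layer = PositionEnhancedMPNNLayer(num_nodes=5, in_dim=8, pos_dim=4, out_dim=16)
    print(layer(x, adj).shape)  # torch.Size([5, 16])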
