Abstract

Graph neural networks (GNNs) have proven effective for a variety of graph learning applications. Typical GNNs iteratively aggregate messages from immediate neighbors under the homophily assumption. However, many networks in daily life are heterophilous, and the ability of these GNNs on them is limited. Recently, some GNN models have been proposed to handle networks with heterophily via key designs such as aggregating higher-order neighbors and combining intermediate representations. However, "noise" transmitted from different-order neighbors is still injected into the node representations. In this paper, we propose a new GNN model, called KNN-GNN, to effectively perform node classification on networks with various homophily levels. The main idea of KNN-GNN is to learn a comprehensive and accurate representation for each node by integrating not only the local information from its neighborhood but also the non-local information held by similar nodes scattered across the network. Specifically, the local information of a node is generated from itself and its 1-hop neighbors. Then, we project all nodes into a common subspace, where similar nodes are expected to be close to each other. The non-local information of a node is gathered by aggregating its K-nearest neighbors found in this common subspace. We evaluate the performance of KNN-GNN on both real and synthetic datasets, including networks with diverse homophily levels. The results demonstrate that KNN-GNN outperforms state-of-the-art baselines. Moreover, ablation experiments show that the core designs of KNN-GNN play a critical role in node representation learning.
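The two-branch aggregation described above (local 1-hop information plus non-local information from K-nearest neighbors in a learned common subspace) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the single linear projection `W_proj`, cosine similarity as the closeness measure, and mean aggregation are all assumptions made for the sake of the example.

```python
import numpy as np

def knn_gnn_layer(X, A, W_proj, k=3):
    """Hedged sketch of one KNN-GNN-style layer (names hypothetical).

    X:      (n, d) node feature matrix
    A:      (n, n) binary adjacency matrix
    W_proj: (d, d') assumed projection into a common subspace
    Returns the concatenation of local (1-hop) and non-local (KNN)
    aggregated representations, shape (n, 2d).
    """
    n = X.shape[0]

    # Local information: each node aggregates itself and its 1-hop
    # neighbors (mean aggregation assumed here).
    A_self = A + np.eye(n)
    local = (A_self @ X) / A_self.sum(axis=1, keepdims=True)

    # Project all nodes into a common subspace where similar nodes
    # are expected to be close; cosine similarity is an assumption.
    Z = X @ W_proj
    Z = Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-12)
    sim = Z @ Z.T
    np.fill_diagonal(sim, -np.inf)  # a node is not its own neighbor

    # Non-local information: mean over each node's K nearest neighbors
    # in the subspace, regardless of graph distance.
    knn_idx = np.argsort(-sim, axis=1)[:, :k]
    non_local = X[knn_idx].mean(axis=1)

    return np.concatenate([local, non_local], axis=1)
```

In a trained model, `W_proj` would be learned so that nodes of the same class land close together in the subspace, letting heterophilous nodes gather signal from similar but non-adjacent nodes.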
