Abstract

Graph Neural Networks (GNNs) have achieved remarkable progress in the field of graph representation learning. Their most prominent characteristic, propagating features along edges, degrades their performance on most heterophilic graphs. Some studies attempt to construct KNN graphs to improve graph homophily. However, there is no prior knowledge for choosing a proper K, and these methods may suffer from the problem of Inconsistent Similarity Distribution (ISD). To address this issue, we propose Probability Graph Complementation Contrastive Learning (PGCCL), which adaptively constructs a complementation graph. We employ a Beta Mixture Model (BMM) to distinguish intra-class similarity from inter-class similarity. Based on the posterior probability, we construct Probability Complementation Graphs to form contrastive views. Contrastive learning prompts the model to preserve complementary information for each node across different views. By combining the original graph embedding with the complementation graph embedding, the final embedding captures rich semantics in the fine-tuning stage. Finally, comprehensive experimental results on 20 datasets, including both homophilic and heterophilic graphs, verify the effectiveness of our algorithm and the quality of the probability complementation graph compared with other state-of-the-art methods.
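To make the BMM step concrete, the sketch below (not the authors' code) fits a two-component Beta mixture to pairwise node similarities with a simple EM procedure and uses the resulting posterior probabilities to decide which candidate edges look intra-class versus inter-class. The function names, initialization, and the 0.9 threshold are illustrative assumptions, not part of the published method.

```python
import numpy as np
from scipy.stats import beta

def fit_bmm(x, n_iter=50, eps=1e-6):
    """EM for a 2-component Beta mixture using weighted method-of-moments updates.
    x: similarity values in [0, 1] (e.g., cosine similarities rescaled)."""
    x = np.clip(x, eps, 1 - eps)  # Beta support is the open interval (0, 1)
    a = np.array([2.0, 5.0]); b = np.array([5.0, 2.0]); pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for every similarity value
        lik = np.stack([pi[k] * beta.pdf(x, a[k], b[k]) for k in range(2)], axis=1)
        gamma = lik / lik.sum(axis=1, keepdims=True)
        # M-step: weighted mean/variance -> Beta parameters via method of moments
        for k in range(2):
            w = gamma[:, k]
            mu = np.average(x, weights=w)
            var = np.average((x - mu) ** 2, weights=w) + eps
            common = mu * (1 - mu) / var - 1
            a[k], b[k] = mu * common, (1 - mu) * common
        pi = gamma.mean(axis=0)
    return a, b, pi

def posterior(x, a, b, pi, k):
    """Posterior probability that each similarity value belongs to component k."""
    x = np.clip(x, 1e-6, 1 - 1e-6)
    lik = np.stack([pi[j] * beta.pdf(x, a[j], b[j]) for j in range(2)], axis=1)
    return lik[:, k] / lik.sum(axis=1)

# Toy usage: two overlapping Beta distributions stand in for inter-class (low)
# and intra-class (high) similarity scores of candidate KNN edges.
rng = np.random.default_rng(0)
sims = np.concatenate([rng.beta(2, 6, 500), rng.beta(6, 2, 500)])
a, b, pi = fit_bmm(sims)
intra_idx = int(np.argmax(a / (a + b)))      # component with the higher mean
p_intra = posterior(sims, a, b, pi, intra_idx)
keep = p_intra > 0.9                         # assumed cutoff for confident edges
```

In this sketch the posterior replaces a hard choice of K: instead of keeping a fixed number of nearest neighbors, candidate edges are weighted or filtered by how confidently the mixture assigns them to one component, which is the kind of adaptive selection the abstract attributes to the probability complementation graph.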
