Abstract

Semi-supervised node classification is the task of predicting the labels of unlabeled nodes using a limited number of labeled nodes and numerous unlabeled nodes. Recently, Graph Neural Networks (GNNs) have achieved remarkable success in this task. However, GNNs typically have shallow architectures and only consider labeled nodes and their low-order neighbors during training. As a result, the supervision signals from the massive number of unlabeled nodes remain underutilized. To address this limitation, graph contrastive learning has been applied to semi-supervised node classification; it pulls positive nodes together and pushes negative nodes apart in the embedding space. Nevertheless, existing node-level contrastive learning methods usually sample only the same node from two augmented views as positive pairs and treat all other nodes as negatives. Consequently, many semantically similar nodes are not sampled as positives but are mistakenly sampled as negatives. To tackle this issue, we propose a novel Label-guided Graph Contrastive Learning (LGGCL) training algorithm for semi-supervised node classification. Specifically, we first propose a Label-guided Graph Contrastive Learning framework as the basis of the LGGCL training algorithm. Then we incorporate a self-checking mechanism based on deep clustering to ensure the authenticity of the sampled positive nodes. Moreover, we design a reweighting strategy based on the probability distribution of the anchor node to enhance the effect of hard negative nodes. Finally, experimental results on various graph benchmarks demonstrate the superiority of our LGGCL.
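The label-guided sampling idea described above can be sketched as an InfoNCE-style loss in which nodes sharing a (pseudo-)label with the anchor are treated as positives rather than only the anchor's augmented view. This is a minimal illustrative sketch, not the authors' implementation: the function name, the use of NumPy, and the temperature value are all assumptions, and the self-checking and reweighting components are omitted.

```python
import numpy as np

def label_guided_contrastive_loss(z, labels, tau=0.5):
    """Illustrative label-guided InfoNCE sketch (not the paper's exact loss).

    z:      (n, d) array of node embeddings
    labels: (n,) array of labels or cluster-checked pseudo-labels
    tau:    temperature (hypothetical default)
    """
    # L2-normalise embeddings so the dot product is cosine similarity
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = np.exp(z @ z.T / tau)  # exponentiated pairwise similarities
    n = len(labels)
    loss, counted = 0.0, 0
    for i in range(n):
        pos = labels == labels[i]  # same label -> positive candidates
        pos[i] = False             # exclude the anchor itself
        if not pos.any():
            continue
        denom = sim[i].sum() - sim[i, i]  # all nodes except the anchor
        # average -log p over the positive set of anchor i
        loss += -np.log(sim[i, pos] / denom).mean()
        counted += 1
    return loss / max(counted, 1)
```

Because the denominator sums over every non-anchor node, semantically similar nodes still appear among the negatives unless, as in LGGCL, the label guidance moves them into the positive set.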
