Abstract

Contrastive learning, which encourages the representations of positive samples to be close and those of negative samples to be far apart, has achieved breakthrough advances in graph representation learning. However, existing graph contrastive learning (GCL) frameworks devote great effort to designing augmentation strategies for positive samples, while simply using all other nodes as negative samples and treating them equally, ignoring the differences among negative samples. Moreover, almost every GCL framework replaces the original graph with augmented views, which may cause unexpected information loss due to randomly perturbed edges and features. To address these issues, we propose a self-supervised graph Contrastive learning framework with Curriculum negative sampling, called ConCur, which feeds negative samples to contrastive learning in an easy-to-hard fashion via our proposed curriculum negative sampling strategy. Specifically, ConCur consists of two phases: Graph Augmentations and Curriculum Contrastive Training. The Graph Augmentations phase constructs positive and negative samples through different graph augmentation strategies. In Curriculum Contrastive Training, we first use a triplet network, which takes the original graph and the augmented views as input, to learn node representations. Then, we apply the proposed curriculum negative sampling strategy to enumerate negative samples from easy to hard for contrastive training. Finally, we use a unified contrastive loss to optimize the node representations. Comprehensive experiments on five real-world datasets show that ConCur yields encouraging results on the node classification task.
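The abstract does not specify how negatives are scored or scheduled, so the following is only a minimal sketch of what an easy-to-hard (curriculum) negative sampling scheme inside an InfoNCE-style contrastive loss could look like. The cosine-similarity difficulty score, the linear pacing function, and the temperature value are assumptions for illustration, not the paper's exact formulation.

```python
# Illustrative sketch of curriculum negative sampling for contrastive training.
# Assumptions: difficulty = cosine similarity to the anchor (harder = more similar),
# pacing = linearly growing fraction of the negative pool, loss = InfoNCE.
import torch
import torch.nn.functional as F

def curriculum_infonce(anchor, positive, candidates, epoch, total_epochs, tau=0.5):
    """anchor, positive: (d,) node embeddings from two views.
    candidates: (N, d) embeddings of other nodes (negative pool).
    epoch/total_epochs control the pacing from easy to hard negatives."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    candidates = F.normalize(candidates, dim=-1)

    # Difficulty score: similarity of each candidate negative to the anchor.
    sims = candidates @ anchor                       # (N,)

    # Pacing function: admit a growing fraction of the pool as training proceeds.
    frac = min(1.0, (epoch + 1) / total_epochs)
    k = max(1, int(frac * candidates.size(0)))

    # Keep the k easiest negatives (lowest similarity) for this step.
    easy_idx = torch.argsort(sims)[:k]
    neg_sims = sims[easy_idx]

    # InfoNCE: the positive pair occupies index 0 among the logits.
    pos_sim = anchor @ positive
    logits = torch.cat([pos_sim.view(1), neg_sims]) / tau
    return F.cross_entropy(logits.unsqueeze(0), torch.zeros(1, dtype=torch.long))
```

In this sketch, early epochs contrast the anchor only against dissimilar (easy) negatives, and the pool gradually expands to include harder ones, which is one plausible realization of the easy-to-hard schedule described above.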
