Abstract

In the field of recommendation algorithms, representation learning for users and items has evolved from using single IDs or historical interactions to exploiting higher-order neighbors, achieved by modeling the user–item interaction graph to capture user preferences for items. Despite the promising results of these algorithms, they still suffer from data sparsity. To mitigate its impact, contrastive learning has been adopted in graph collaborative filtering to enhance performance. However, current recommendation algorithms that use contrastive learning yield uneven representations after data augmentation and do not consider the potential relationships among users (or items). To address these challenges, we propose a graph neural network-based recommendation model that integrates contrastive learning (GNNCL). This model combines noise-based data augmentation with the exploration of semantic neighbors for nodes. For the structural neighbors on the interaction graph, we introduce a novel and straightforward contrastive learning approach that abandons previous graph augmentation methods and instead injects uniform noise into the embedding space to create contrastive views. To uncover potential semantic neighbor relationships in the semantic space, we assume that users with similar representations are semantic neighbors and incorporate these semantic neighbors into prototype contrastive learning: a clustering algorithm yields prototypes for users and items, and the EM algorithm optimizes the prototype contrastive objective. Experimental results validate the effectiveness of our approach. In particular, on the Yelp2018 and Amazon-book datasets, our method exhibits significant performance improvements over basic graph collaborative filtering models.
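As a rough illustration of the noise-based view construction and contrastive objective described above (a minimal sketch, not the paper's implementation; the sign-aligned uniform noise and in-batch InfoNCE loss follow the common SimGCL-style formulation, and all function names here are hypothetical):

```python
import numpy as np

def normalize(x, axis=-1):
    # Row-wise L2 normalization, with a small constant for stability.
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + 1e-12)

def add_uniform_noise(emb, eps=0.1, rng=None):
    # Build a contrastive view by perturbing embeddings with small
    # uniform noise, sign-aligned with each embedding so the
    # perturbation stays in the same orthant (an assumed design choice).
    rng = np.random.default_rng() if rng is None else rng
    noise = normalize(rng.random(emb.shape)) * np.sign(emb)
    return emb + eps * noise

def info_nce_loss(view1, view2, tau=0.2):
    # InfoNCE between two noisy views: each row's counterpart in the
    # other view is the positive; all other rows are in-batch negatives.
    v1, v2 = normalize(view1), normalize(view2)
    logits = (v1 @ v2.T) / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))  # mean -log p(positive)

rng = np.random.default_rng(0)
emb = rng.standard_normal((32, 64))       # e.g. user embeddings from a GNN layer
view_a = add_uniform_noise(emb, rng=rng)  # two independently perturbed views
view_b = add_uniform_noise(emb, rng=rng)
loss = info_nce_loss(view_a, view_b)
```

The prototype contrastive part would replace the in-batch positives with cluster centroids (e.g. from k-means over user embeddings), alternating cluster assignment and loss minimization in EM fashion.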
