Abstract

Representation learning of users and items is at the core of recommendation, and benefiting from the development of graph neural networks (GNNs), graph collaborative filtering (GCF), which captures higher-order connectivity, has been successful in the recommendation domain. Nevertheless, the matrix sparsity problem in collaborative filtering and the tendency of higher-order embeddings to over-smooth in GNNs limit further performance improvements. Contrastive learning (CL) has been introduced into GCF and alleviates these problems to some extent. However, existing methods usually require graph perturbation to construct augmented views or design complex CL tasks, which limits the further development of CL-based methods in recommendation. We propose a simple CL framework that requires no graph augmentation; instead, it generates contrastive views with dropout. Specifically, we first add a dropout operation to the GNN computation and then feed the same batch of samples through the network twice. Exploiting the randomness of dropout, we obtain a pair of views with random noise, and maximizing the similarity of each view pair serves as an auxiliary task that complements the recommendation task. In addition, we make a simple modification to the GNN computation, using cross-layer connected graph convolution to alleviate the information loss caused by embedding smoothing. We name the proposed method Simple Contrastive Learning Graph Neural Network based on dropout (SimDCL). Extensive experiments on five public datasets demonstrate the effectiveness of SimDCL; on the Amazon Books and Ta-Feng datasets, our approach achieves 44% and 43% performance gains over the baseline.
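The dropout-based view construction described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the embedding matrix, dropout rate, temperature, and the InfoNCE-style similarity objective are all assumptions chosen to show the mechanism (two stochastic forward passes over the same batch, then pulling matching view pairs together).

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, rng):
    # Zero each entry with probability p; rescale survivors by 1/(1-p)
    # so the expected value of the embedding is unchanged.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def cosine_sim(a, b):
    # Pairwise cosine similarity between rows of a and rows of b.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def infonce_loss(z1, z2, temperature=0.2):
    # Treat (z1[i], z2[i]) as a positive pair and all other rows of z2
    # as negatives; minimizing this maximizes diagonal similarity.
    sim = cosine_sim(z1, z2) / temperature
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

# Toy embeddings for a batch of 4 users: feeding the same batch through
# a dropout-equipped network twice yields two noisy views of it.
emb = rng.standard_normal((4, 8))
z1 = dropout(emb, 0.1, rng)
z2 = dropout(emb, 0.1, rng)
cl_loss = infonce_loss(z1, z2)  # auxiliary loss added to the ranking loss
```

In a full model, `cl_loss` would be weighted and summed with the main recommendation loss (e.g. BPR) rather than optimized alone.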
