Abstract

The goal of session-based recommendation (SBR) is to predict the next item for anonymous users at a given point in time. Previous methods usually learn session representations from the item prediction loss alone, yet session-based data consists only of users' limited short-term interactions. Consequently, models trained with this loss often suffer from data sparsity. Contrastive learning can derive self-supervision signals from raw data, effectively alleviating this problem. However, existing contrastive recommendation approaches mainly generate self-supervision signals via feature masking, which is unfit for SBR because session-based data is too sparse to mine strong self-supervision signals by masking features. In this paper, we propose a Contrastive Graph Self-Attention Network (abbreviated as CGSNet) for SBR. Specifically, we design three distinct graph encoders to capture different levels of item transition patterns, and obtain a collaborative session representation by aggregating the item representations related to the current session through an attention-based fusion module. Meanwhile, we devise a self-attention subnetwork to learn complex item transition information, and obtain a local session representation by averaging the item representations within the current session. Since these two session representations model a given session from a global-level and a local-level perspective, respectively, we further introduce a contrastive learning paradigm that maximizes the mutual information between the collaborative and local session representations to enhance recommendation performance. Extensive experimental results on three widely used benchmark datasets validate the efficacy of our method, which outperforms the competing methods.
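The abstract does not give the exact form of the contrastive objective, but maximizing mutual information between two views of a session is commonly implemented with an InfoNCE-style loss, where the global (collaborative) and local representations of the same session form a positive pair and other sessions in the batch serve as negatives. The sketch below is an illustrative assumption, not the paper's actual loss; the function name, temperature value, and in-batch negative sampling are all hypothetical.

```python
import numpy as np

def info_nce_loss(z_global, z_local, temperature=0.2):
    """Illustrative InfoNCE-style contrastive loss between two views
    of each session (hypothetical; not CGSNet's exact objective).

    z_global, z_local: (batch, dim) arrays holding the collaborative
    (global-level) and local session representations. Matching rows
    are positive pairs; all other rows in the batch act as negatives.
    """
    # L2-normalize so dot products become cosine similarities
    g = z_global / np.linalg.norm(z_global, axis=1, keepdims=True)
    l = z_local / np.linalg.norm(z_local, axis=1, keepdims=True)
    logits = g @ l.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positive pairs sit on the diagonal of the similarity matrix
    return -np.mean(np.diag(log_prob))
```

Under this formulation, perfectly aligned view pairs yield a lower loss than mispaired ones, which is the sense in which minimizing the loss pulls the two representations of the same session together.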


