Abstract
Sequential recommendation (SR) is an essential component of modern recommender systems. It models users' dynamic interests based on their sequential interactions. Recently, several studies have applied sequential deep learning models such as Recurrent Neural Networks and Transformers to sequential recommendation, with promising results. Inspired by the rise of contrastive learning techniques, some methods enhance sequential deep learning models by designing contrastive losses over self-supervised signals. However, a number of obstacles still make it challenging to learn user representations efficiently with contrastive learning. These issues include, but are not limited to, data sparsity, noisy data, and sampling bias (e.g., false negatives), particularly in complex, parameter-intensive models. In light of these challenges, we examine how to handle data sparsity and noisy data by applying contrastive Self-Supervised Learning (SSL) and Momentum Contrast (MoCo) to sequential recommendation. Beyond the typical in-batch negatives, our basic idea is to maintain a dynamic queue that expands the pool of negative samples using a moving-averaged encoder. After being augmented by sequence-level and embedding-level methods, the representations from all historical encoder outputs are pushed into the dynamic queue, which usually leads to sampling bias when potential positives in the queue are used as expanded negative samples in contrastive learning. To tackle this issue, we integrate the momentum updating mechanism with a novel instance weighting mechanism that penalizes false negatives and guarantees the model's efficacy. To this end, we introduce a new framework called the Momentum Contrastive Learning Framework for Sequential Recommendation (MoCo4SRec). Experiments on eight real-world datasets demonstrate that the proposed approach improves model performance over current benchmarks by learning better user representations.
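The core mechanism sketched in the abstract, a momentum-updated key encoder whose outputs feed a fixed-size queue of extra negatives consumed by an InfoNCE loss, can be illustrated as follows. This is a minimal sketch of the generic MoCo machinery under stated assumptions, not the authors' MoCo4SRec implementation: the class name `MomentumContrast`, the encoder arguments, and the hyperparameter defaults are illustrative, and the paper's instance-weighting term for false negatives is omitted.

```python
# Minimal PyTorch sketch of momentum contrast with a dynamic negative queue.
# Assumptions: encoder_q/encoder_k share the same architecture, the batch size
# divides queue_size evenly, and embeddings are L2-normalized.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MomentumContrast(nn.Module):
    def __init__(self, encoder_q, encoder_k, dim=64, queue_size=4096,
                 momentum=0.999, temperature=0.07):
        super().__init__()
        self.encoder_q = encoder_q   # query encoder, updated by backprop
        self.encoder_k = encoder_k   # key encoder, updated by momentum averaging
        self.m = momentum
        self.t = temperature
        # Dynamic queue of past key-encoder outputs, used as negatives
        # in addition to the current in-batch negatives.
        self.register_buffer("queue", F.normalize(torch.randn(dim, queue_size), dim=0))
        self.register_buffer("queue_ptr", torch.zeros(1, dtype=torch.long))

    @torch.no_grad()
    def _momentum_update(self):
        # theta_k <- m * theta_k + (1 - m) * theta_q
        for p_q, p_k in zip(self.encoder_q.parameters(), self.encoder_k.parameters()):
            p_k.data.mul_(self.m).add_(p_q.data, alpha=1.0 - self.m)

    @torch.no_grad()
    def _enqueue(self, keys):
        # Push the newest keys into the queue, overwriting the oldest entries.
        bs = keys.shape[0]
        ptr = int(self.queue_ptr)
        self.queue[:, ptr:ptr + bs] = keys.T
        self.queue_ptr[0] = (ptr + bs) % self.queue.shape[1]

    def forward(self, seq_view_q, seq_view_k):
        # seq_view_q / seq_view_k: two augmented views of the same interaction sequence.
        q = F.normalize(self.encoder_q(seq_view_q), dim=1)
        with torch.no_grad():
            self._momentum_update()
            k = F.normalize(self.encoder_k(seq_view_k), dim=1)

        # Positive logits: agreement between the two views of the same user.
        l_pos = (q * k).sum(dim=1, keepdim=True)
        # Negative logits: similarity to every representation stored in the queue.
        l_neg = q @ self.queue.clone().detach()

        logits = torch.cat([l_pos, l_neg], dim=1) / self.t
        labels = torch.zeros(logits.shape[0], dtype=torch.long, device=logits.device)
        loss = F.cross_entropy(logits, labels)   # InfoNCE objective

        self._enqueue(k)
        return loss
```

Because the queue stores encoder outputs from many past batches, it can contain representations of sequences that are semantically close to the current query (potential positives); the instance-weighting mechanism described in the abstract is intended to down-weight exactly those entries.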