Abstract

Self-supervised learning (SSL) has gained attention in machine learning in recent years for its ability to reduce reliance on labeled datasets. However, the adoption of SSL in large-scale recommendation systems has lagged behind because of the sparse and long-tailed characteristics of their data. In 2021, an article proposed applying SSL to recommendation systems to improve the performance of recommender models. Building on that investigation, this article further explores the role of SSL in recommendation systems and examines how it can improve model efficiency. To answer its research questions, this paper tests three models with different numbers of towers to determine which configuration benefits most from SSL. It is found that applying SSL on the item side only (two-tower DNNs) produces the best result. When constructing the two-tower DNN model, this article then varies the number of negative pairs in the InfoNCE loss to investigate the tradeoff between the numbers of positive and negative samples. Only a weak correlation is found between this ratio and model performance; hence, it is concluded that changing the ratio of positive to negative samples does not necessarily affect the two-tower DNN model. In the experimental stage, this paper uses a real-world dataset with 100k training samples to validate and compare the results.
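For readers unfamiliar with the InfoNCE loss varied in the experiments above, the following is a minimal sketch of how it is computed for one query against one positive and k sampled negatives. The cosine similarity, temperature value, and embedding dimensions here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def info_nce_loss(query, positive, negatives, temperature=0.1):
    """InfoNCE contrastive loss for a single query embedding.

    query:     (d,)   anchor embedding
    positive:  (d,)   embedding of the positive (matching) item
    negatives: (k, d) embeddings of k sampled negative items
    """
    def cos(a, b):
        # cosine similarity between two vectors (assumed similarity measure)
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    pos_logit = cos(query, positive) / temperature
    neg_logits = np.array([cos(query, n) / temperature for n in negatives])

    # softmax cross-entropy with the positive at index 0,
    # computed via a numerically stable log-sum-exp
    logits = np.concatenate([[pos_logit], neg_logits])
    m = logits.max()
    log_sum_exp = m + np.log(np.exp(logits - m).sum())
    return log_sum_exp - pos_logit  # = -log p(positive | query)
```

Each additional negative enlarges the denominator of the softmax, so adding negatives can only increase this per-query loss; the paper's finding is that, in practice, varying their number had little effect on final model quality.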
