Abstract

Previous highly scalable One-Class Collaborative Filtering (OC-CF) methods such as Projected Linear Recommendation (PLRec) have advocated using fast randomized SVD to embed items into a latent space, followed by linear regression methods to learn personalized recommendation models per user. However, naive SVD embedding methods often exhibit a strong popularity bias that prevents them from accurately embedding less popular items, which is exacerbated by the extreme sparsity of implicit feedback matrices in the OC-CF setting. To address this deficiency, we leverage insights from Noise Contrastive Estimation (NCE) to derive a closed-form, efficiently computable depopularized embedding. We show that NCE item embeddings combined with a personalized user model from PLRec produce superior recommendations that adequately account for popularity bias. Further analysis of the popularity distribution of recommended items demonstrates that NCE-PLRec uniformly distributes recommendations over the popularity spectrum, while other methods exhibit distinct biases towards specific popularity subranges. Empirically, NCE-PLRec produces highly competitive performance with run-times an order of magnitude faster than existing state-of-the-art approaches for OC-CF.
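To make the pipeline described above concrete, the following is a minimal sketch of the two stages: a popularity-reweighted item embedding obtained via randomized SVD, followed by PLRec's closed-form per-user linear regression against the original implicit matrix. The reweighting used here is a PMI-style down-weighting by item popularity, assumed as a stand-in for the paper's exact closed-form NCE weights; the function names (`nce_reweight`, `nce_plrec`) and hyperparameter values are illustrative, not taken from the authors' released code.

```python
# Sketch of an NCE-PLRec-style pipeline (assumptions noted in comments).
import numpy as np
import scipy.sparse as sp
from sklearn.utils.extmath import randomized_svd


def nce_reweight(R):
    """Down-weight popular items in the implicit matrix R (users x items).

    Assumption: each observed (u, i) entry is rescaled by max(log(|U| / n_i), 0),
    where n_i is item i's interaction count, so very popular items contribute
    less to the embedding. The paper derives its exact weights from NCE.
    """
    num_users = R.shape[0]
    item_counts = np.asarray(R.sum(axis=0)).ravel() + 1e-12
    weights = np.maximum(np.log(num_users / item_counts), 0.0)
    return sp.csr_matrix(R.multiply(weights.reshape(1, -1)))


def nce_plrec(R, rank=50, lam=100.0):
    """Embed items via randomized SVD of the reweighted matrix, then fit
    per-user ridge regressions (PLRec) against the original matrix R."""
    D = nce_reweight(R)
    # Item embeddings come from the depopularized matrix D.
    _, sigma, VT = randomized_svd(D, n_components=rank, random_state=0)
    Q = VT.T * np.sqrt(sigma)            # items x rank
    # Shared closed-form ridge solution: W = R Q (Q^T Q + lam I)^{-1}
    gram = Q.T @ Q + lam * np.eye(rank)
    W = (R @ Q) @ np.linalg.inv(gram)    # users x rank
    return W, Q


if __name__ == "__main__":
    # Toy implicit-feedback matrix: 4 users x 5 items.
    R = sp.csr_matrix(np.array([[1, 1, 0, 0, 1],
                                [1, 0, 1, 0, 0],
                                [0, 1, 1, 1, 0],
                                [1, 0, 0, 1, 1]], dtype=float))
    W, Q = nce_plrec(R, rank=3, lam=1.0)
    scores = W @ Q.T                     # predicted preference scores per user
    print(np.round(scores, 2))
```

Because the ridge Gram matrix is shared across users, the per-user regressions reduce to a single small matrix inverse plus sparse matrix products, which is consistent with the scalability claim in the abstract.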
