Abstract

Matrix factorization is widely used in recommender systems. Although existing popular incremental matrix factorization methods are effective in reducing time complexity, they simply assume that the similarity between items or users is invariant: for instance, they keep the item feature matrix fixed and update only the user feature matrix, avoiding re-training of the entire model. However, as new users arrive continuously, the fitting error accumulates because the additional distribution information carried by the items is never exploited. In this paper, we present an alternative and reasonable approach built on a relaxed assumption, namely that the similarity between items (users) remains relatively stable after updating. Concretely, using the prediction error on the new data as auxiliary features, our method updates both feature matrices simultaneously, so users' preferences can be modeled better than by adjusting only the corresponding feature matrix. In addition, our method keeps the feature dimension small by taking advantage of matrix sketching. Experimental results show that our proposal outperforms existing incremental matrix factorization methods.
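For illustration, a minimal sketch of the simultaneous-update idea is given below. It is not the full method of this paper: the auxiliary prediction-error features and the matrix-sketching step are omitted, and the identifiers (U, V, lr, reg) are placeholders. The point of contrast is that classic fold-in style incremental updates would freeze V and solve only for the affected rows of U, whereas here both touched user and item factors receive gradient steps.

```python
import numpy as np

def incremental_update(U, V, new_ratings, lr=0.01, reg=0.02, epochs=10):
    """Hypothetical sketch: SGD update on newly observed ratings.

    U: (n_users, k) user feature matrix
    V: (n_items, k) item feature matrix
    new_ratings: iterable of (user_idx, item_idx, rating) tuples
    """
    for _ in range(epochs):
        for u, i, r in new_ratings:
            err = r - U[u] @ V[i]                      # prediction error on the new rating
            U[u] += lr * (err * V[i] - reg * U[u])     # adjust the user's factors
            V[i] += lr * (err * U[u] - reg * V[i])     # also adjust the item's factors,
                                                       # instead of keeping V fixed
    return U, V
```

A usage pattern under these assumptions would be to call incremental_update with only the newly arrived ratings whenever a batch of new users (or new interactions) is observed, rather than re-factorizing the full rating matrix.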

