Abstract

Recommender systems are among the most effective tools for combating information overload. As a popular method for recommendation, regularized singular value decomposition (RSVD) offers strong prediction accuracy. However, as the rating matrix A grows, RSVD suffers both from out-of-memory failures and from high computational cost. To alleviate these drawbacks, we apply a CUR decomposition to reduce memory consumption before RSVD is run. Moreover, because the rating matrix A is typically sparse, we propose a novel column sampling algorithm and sparseness measure to address the data sparsity problem. The main computational cost in the original RSVD comes from operations on the full rating matrix; replacing A with CUR or C in RSVD reduces this cost, which depends on the matrix dimensions m and n, the number of iterations t, the number of latent features, and the size c of the sampled data subspace. Because CUR is expressed explicitly in terms of a small number of actual columns and actual rows of the original data matrix, the resulting decomposition is more interpretable. The advantage of our proposed CUR+RSVD and C+RSVD collaborative prediction approaches is that they not only handle large-scale matrices quickly but also preserve the sparsity of the original matrix; more interestingly, they achieve higher prediction accuracy. Experimental results on the MovieLens and Joke datasets, among others, show that the proposed methods handle the sparsity issue on large-scale low-rank matrices effectively. Compared with RSVD, they achieve far better prediction accuracy and recommendation quality while saving about 70% of training time on the same datasets.
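To make the CUR step concrete, the following is a minimal NumPy sketch of a CUR decomposition: sample c actual columns and r actual rows of A, then solve for the small linking matrix U by least squares. The norm-proportional sampling used here is a standard baseline and is an assumption for illustration; the paper's own contribution is a different, sparseness-aware column sampling scheme, which would replace it.

```python
import numpy as np

def cur_decompose(A, c, r, seed=0):
    """Sketch of CUR: pick c columns and r rows of A, then fit U.

    Sampling is proportional to squared column/row norms (a common
    baseline, not the paper's sparseness-aware sampler).
    """
    rng = np.random.default_rng(seed)
    total = (A ** 2).sum()
    col_p = (A ** 2).sum(axis=0) / total     # column sampling probabilities
    row_p = (A ** 2).sum(axis=1) / total     # row sampling probabilities
    cols = rng.choice(A.shape[1], size=c, replace=False, p=col_p)
    rows = rng.choice(A.shape[0], size=r, replace=False, p=row_p)
    C = A[:, cols]                            # actual columns of A
    R = A[rows, :]                            # actual rows of A
    # U chosen so that C @ U @ R best approximates A in Frobenius norm
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, U, R

# Usage: a rank-1 "rating" matrix is recovered exactly from 2 columns/rows
A = np.outer(np.arange(1.0, 7.0), np.arange(1.0, 5.0))   # 6 x 4, rank 1
C, U, R = cur_decompose(A, c=2, r=2)
rel_err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
```

Because C and R are taken directly from A, they stay as sparse as the sampled columns and rows themselves, which is the interpretability and sparsity-preservation property the abstract refers to.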
