Abstract

Item recommendation helps people discover items of potential interest among large item collections. One of the most common applications is top-n recommendation on implicit feedback datasets (e.g., listening history, watching history, or visiting history). In this paper, we assume that the implicit feedback matrix has a local property: the original matrix is not globally low rank, but some of its sub-matrices are. We propose Local Weighted Matrix Factorization (LWMF) for top-n recommendation, employing a kernel function to capture the local property and a weight function to model user preferences. Sub-matrix factorization in LWMF also relieves the sparsity problem, since the sub-matrices are much denser than the original matrix. We propose a heuristic method to select sub-matrices that approximate the original matrix well; the greedy selection algorithm has an approximation guarantee of factor \(1-\frac{1}{e}\), yielding a near-optimal solution. Experimental results on two real datasets show that LWMF improves recommendation precision and recall by about 30% compared with the best case of weighted matrix factorization (WMF).

Highlights

  • MF [4] projects users and items into a latent low-dimensional space

  • Based on kernel function, we propose DCGASC (Discounted Cumulative Gain Anchor Point Set Cover) to select the sub-matrices in order to approximate the original matrix better

  • Based on the item recommendation problem, we further propose a variant, user-based Local Weighted Matrix Factorization (LWMF), which is better suited to item recommendation and achieves better performance
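The DCGASC highlight above can be read as greedy maximization of a monotone submodular coverage objective, which is what yields the \(1-\frac{1}{e}\) approximation guarantee mentioned in the abstract. The sketch below is illustrative only: the Epanechnikov kernel, the one-dimensional distance, and all function names are assumptions for the demonstration, not the paper's exact objective.

```python
def epanechnikov(dist, h=1.0):
    """Epanechnikov kernel: weight decays to zero beyond bandwidth h."""
    return max(0.0, 1.0 - (dist / h) ** 2)

def coverage(anchors, points, h=1.0):
    """Monotone submodular objective: each point contributes the best
    kernel weight it receives from any selected anchor."""
    return sum(
        max((epanechnikov(abs(p - a), h) for a in anchors), default=0.0)
        for p in points
    )

def greedy_select(candidates, points, q, h=1.0):
    """Greedily pick q anchor points, each time taking the candidate with
    the largest marginal coverage gain. For monotone submodular objectives
    this greedy solution is within (1 - 1/e) of the optimum."""
    selected = []
    for _ in range(q):
        best = max(
            (c for c in candidates if c not in selected),
            key=lambda c: coverage(selected + [c], points, h),
        )
        selected.append(best)
    return selected
```

With two tight clusters of points, the greedy rule picks one anchor per cluster, since a second anchor in an already-covered cluster adds almost no marginal gain.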


Summary

Introduction

MF [4] projects users and items into a latent low-dimensional space; the missing entries of the original matrix can then be recovered from the dot product of user and item latent vectors. We propose Local Weighted Matrix Factorization (LWMF), which integrates LLORMA [7] with WMF [10] to recommend items on implicit feedback datasets, employing a kernel function to capture the local property and a weight function to model user preferences. LWMF exploits locality by dividing the original matrix into sub-matrices, which also relieves the sparsity problem since each sub-matrix is much denser. Based on the kernel function, we propose DCGASC (Discounted Cumulative Gain Anchor Point Set Cover) to select sub-matrices that better approximate the original matrix. Section 4 presents the DCGASC selection heuristic and the learning algorithm for the local latent vectors.
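The WMF component [10] that LWMF builds on fits binary preferences with per-entry confidence weights derived from the implicit feedback counts. A minimal dense sketch, assuming the standard preference/confidence construction (p = 1 for observed entries, c = 1 + alpha·r) and plain SGD; the hyperparameter values and the function name `wmf` are illustrative choices, not the paper's configuration:

```python
import numpy as np

def wmf(R, k=2, alpha=5.0, reg=0.01, lr=0.01, epochs=300, seed=0):
    """Weighted matrix factorization for implicit feedback.
    Preference p_ui = 1 if r_ui > 0 else 0; confidence c_ui = 1 + alpha * r_ui.
    Minimizes sum_ui c_ui * (p_ui - x_u . y_i)^2 + reg * (|X|^2 + |Y|^2)
    by stochastic gradient descent over all entries (dense toy setting)."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    X = 0.1 * rng.standard_normal((n_users, k))   # user latent vectors
    Y = 0.1 * rng.standard_normal((n_items, k))   # item latent vectors
    P = (R > 0).astype(float)                     # binary preferences
    C = 1.0 + alpha * R                           # confidence weights
    for _ in range(epochs):
        for u in range(n_users):
            for i in range(n_items):
                err = P[u, i] - X[u] @ Y[i]
                gx = -2 * C[u, i] * err * Y[i] + 2 * reg * X[u]
                gy = -2 * C[u, i] * err * X[u] + 2 * reg * Y[i]
                X[u] -= lr * gx
                Y[i] -= lr * gy
    return X, Y
```

In LWMF this same weighted objective would be fit per sub-matrix around each anchor point rather than once globally; the per-entry confidence is what distinguishes WMF from plain MF on binarized data.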

Related Work
Preliminary
Matrix Factorization
Weighted Matrix Factorization
Low-rank Matrix Approximation
Our Proposed Model
Anchor Point Set Selection
Learning Algorithm
User-based Local Weighted Matrix Factorization
Dataset
Setting
Recommendation Methods Comparison
Comparison with Different Number of Anchor Points
Anchor Point Set Selection Methods Comparison
Comparison with Different Discounts for DCGASC
Conclusion and Future Work

