Abstract

Matrix factorization (MF) techniques have achieved immense success in recommender systems (RSs). Because RSs collect and use huge amounts of user data, they raise concerns about data privacy. Differential Privacy (DP), a strict privacy protection framework, has therefore been applied to MF in many efforts. However, designing MF with privacy preservation still faces several challenges, such as error accumulation across the multiple iterations of MF, the introduction of unnecessary noise, and difficult sensitivity analysis. To overcome these problems, we devise a vector-perturbation-based differentially private matrix factorization (VP-DPMF) scheme. Our scheme prevents error accumulation by perturbing the objective function of MF rather than its factorization process or results. It also sidesteps the difficulty of sensitivity analysis by exploiting the polynomial representation of the objective function. Furthermore, our scheme reduces unnecessary noise by confining the perturbation to the vector term of the polynomial, and it preserves the convexity of the original function. Theoretical analysis demonstrates that our scheme achieves good performance in large-scale recommender systems. Experimental results on benchmark datasets show that the proposed scheme provides both a rigorous privacy guarantee and satisfactory recommendation quality.
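
To make the objective-perturbation idea concrete, the Python sketch below shows one plausible reading of it: Laplace noise is drawn once and added only to the rating-dependent vector coefficient of each user's quadratic sub-objective, after which standard alternating least squares runs unchanged. The function name vp_dpmf_sketch, the noise calibration via an assumed sensitivity delta_f, and the scaling of ratings to [0, 1] are illustrative assumptions, not the paper's exact VP-DPMF algorithm.

```python
import numpy as np


def vp_dpmf_sketch(R, mask, d=10, lam=0.1, epsilon=1.0, delta_f=2.0,
                   n_iters=20, rng=None):
    """Illustrative objective-perturbation MF sketch (not the exact VP-DPMF scheme).

    R       : (n_users, n_items) rating matrix, entries assumed scaled to [0, 1]
    mask    : boolean (n_users, n_items) matrix marking observed ratings
    delta_f : assumed L1 sensitivity of the per-user vector term (hypothetical value)
    """
    rng = np.random.default_rng() if rng is None else rng
    n_users, n_items = R.shape
    U = 0.1 * rng.standard_normal((n_users, d))
    V = 0.1 * rng.standard_normal((n_items, d))

    # Draw Laplace noise once per user and fold it into the linear ("vector")
    # coefficient of that user's quadratic objective; the loop below then runs
    # as ordinary alternating least squares, so no extra noise accumulates.
    noise = rng.laplace(scale=delta_f / epsilon, size=(n_users, d))

    for _ in range(n_iters):
        for i in range(n_users):
            obs = mask[i]
            if not obs.any():
                continue
            Vi = V[obs]                            # factors of items rated by user i
            A = Vi.T @ Vi + lam * np.eye(d)        # quadratic coefficient (noise-free)
            b = Vi.T @ R[i, obs] + noise[i]        # vector term carries the DP noise
            U[i] = np.linalg.solve(A, b)
        for j in range(n_items):
            obs = mask[:, j]
            if not obs.any():
                continue
            Uj = U[obs]
            A = Uj.T @ Uj + lam * np.eye(d)
            b = Uj.T @ R[obs, j]                   # item update left unperturbed here
            V[j] = np.linalg.solve(A, b)
    return U, V
```

In this sketch, drawing the noise once and attaching it to the vector term amounts to adding a fixed linear penalty to the objective, which keeps the perturbed problem convex in each factor and avoids injecting fresh noise at every iteration, the two properties the abstract emphasizes.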
