Abstract

Recommender systems are important applications in big data analytics because accurate recommendations and high-value suggestions can bring high profit to both commercial companies and customers. To make precise recommendations, a recommender system often needs large and fine-grained data for training. In the current big data era, data often exists in the form of isolated islands, and it is difficult to integrate the scattered data due to privacy and security concerns. Moreover, privacy laws and regulations make it harder to share data. Therefore, designing a privacy-preserving recommender system is of paramount importance. Existing privacy-preserving recommender system models mainly adopt cryptographic approaches to achieve privacy preservation. However, cryptographic approaches incur heavy overhead when performing encryption and decryption operations and lack flexibility. In this paper, we conduct a privacy analysis of the existing locality-sensitive hashing (LSH) based privacy-preserving recommender system and show how an attacker can retrieve a user's information under such a system. Given these privacy risks, we propose a differentially private LSH approach to build a recommender system that offers differential privacy guarantees for users. Our proposed efficient and scalable federated recommender system can make full use of data from multiple sources owned by different parties while guaranteeing privacy preservation of users' data in the contributing parties. Extensive experiments on real-world benchmark datasets show that our approach achieves both high time efficiency and high accuracy under small privacy budgets.
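To make the idea concrete, the sketch below shows one common way to combine LSH with differential privacy: each party computes a random-projection LSH signature of a user's rating vector and then perturbs each bit with a randomized-response mechanism before sharing it, so the aggregator only ever sees noisy signatures. This is a minimal illustrative sketch under assumed design choices (sign-of-projection hashing, per-bit randomized response, the flip probability 1/(1+e^ε)); it is not necessarily the exact mechanism proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lsh_signature(ratings, planes):
    """Random-projection LSH: one bit per hyperplane (sign of the dot product)."""
    return (planes @ ratings > 0).astype(np.int8)

def randomized_response(bits, epsilon, rng):
    """Flip each bit independently with probability 1/(1+e^eps).
    A standard randomized-response mechanism giving eps-DP per bit
    (illustrative assumption, not the paper's exact noise mechanism)."""
    p_flip = 1.0 / (1.0 + np.exp(epsilon))
    flips = rng.random(bits.shape) < p_flip
    return np.where(flips, 1 - bits, bits)

# Toy example: two parties hold rating vectors over the same item space.
n_items, n_bits, epsilon = 100, 32, 1.0
planes = rng.standard_normal((n_bits, n_items))      # shared random hyperplanes

user_a = rng.integers(0, 6, n_items).astype(float)   # ratings on a 0-5 scale
user_b = user_a + rng.normal(0, 0.5, n_items)        # a similar user elsewhere

sig_a = randomized_response(lsh_signature(user_a, planes), epsilon, rng)
sig_b = randomized_response(lsh_signature(user_b, planes), epsilon, rng)

# The aggregator sees only noisy signatures; similar users still tend to agree,
# so signature agreement can drive cross-party neighbor search for recommendation.
hamming_sim = np.mean(sig_a == sig_b)
print(f"estimated signature agreement: {hamming_sim:.2f}")
```

In such a scheme, the trade-off in the abstract becomes visible: a smaller privacy budget ε raises the flip probability, which degrades signature agreement and thus recommendation accuracy, while larger ε preserves similarity at a weaker privacy guarantee.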
