Abstract

Recommendation systems are often challenged by the presence of cold-start items for which no prior ratings are available. Standard content-based or collaborative-filtering approaches may address this problem by asking users to share their data with a central (cloud-based) server, which uses machine learning to predict appropriate ratings for such items. However, users may be reluctant to share their (confidential) data. Federated learning has recently been leveraged to address these privacy concerns by enabling distributed, on-device training of a single machine learning model. In this work, we propose a federated learning-based approach to the item cold-start problem in recommendation systems. The originality of our solution compared to existing federated learning-based solutions lies in (1) applying federated learning specifically to the cold-start problem; and (2) proposing a trust mechanism that derives trust scores for potential recommenders, followed by a double deep Q-learning scheduling approach that relies on the trust and energy levels of the recommenders to select the best candidates. Simulations on the MovieLens 1M and Epinions datasets suggest that our solution improves the accuracy of recommending cold-start items and reduces the RMSE, MAE, and running time compared to five benchmark approaches.
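To make the recommender-selection idea above more concrete, the short Python sketch below uses a tabular (not deep) double Q-learning stand-in in which the reward for choosing a candidate recommender blends a trust score and an energy level. All names, weights, and values here are illustrative assumptions for exposition only, not the paper's implementation.

    import random

    # Hypothetical candidate recommenders with assumed trust scores and energy levels in [0, 1].
    recommenders = ["r1", "r2", "r3"]
    trust = {"r1": 0.9, "r2": 0.6, "r3": 0.3}
    energy = {"r1": 0.4, "r2": 0.8, "r3": 0.9}

    # Two value tables, as in double Q-learning; here selection is a single-step decision.
    q_a = {r: 0.0 for r in recommenders}
    q_b = {r: 0.0 for r in recommenders}
    alpha, epsilon = 0.1, 0.2  # learning rate and exploration rate (assumed values)

    def reward(r):
        # Illustrative reward: a weighted mix of trust and energy (weights are assumptions).
        return 0.7 * trust[r] + 0.3 * energy[r]

    for _ in range(2000):
        # Epsilon-greedy choice using the sum of the two value estimates.
        if random.random() < epsilon:
            choice = random.choice(recommenders)
        else:
            choice = max(recommenders, key=lambda r: q_a[r] + q_b[r])
        # Double Q-learning: update one table, picked at random, toward the observed reward.
        table = q_a if random.random() < 0.5 else q_b
        table[choice] += alpha * (reward(choice) - table[choice])

    best = max(recommenders, key=lambda r: q_a[r] + q_b[r])
    print("selected recommender:", best)

In this toy setting the scheduler converges to the candidate with the best trust/energy trade-off; the paper's approach applies the same selection principle with deep Q-networks over richer state information.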
