Abstract

Previous federated recommender systems are based on traditional matrix factorization, which can improve personalized service but is vulnerable to gradient inference attacks. Most of them also adopt model averaging to accommodate the data heterogeneity of federated recommender systems, which incurs higher training costs. To address both privacy and efficiency, we propose an efficient federated item similarity model for heterogeneous recommendation, called FedIS, which trains a global item-based collaborative filtering model and eliminates dependence on user features. Specifically, we extend the neural item similarity model to the federated setting, where each client locally optimizes only the shared item feature matrix. We then propose a fast-convergent federated aggregation method, inspired by meta-learning, to handle heterogeneous user updates and accelerate the convergence of global training. Furthermore, we propose a two-stage perturbation method that protects both local training and transmission while reducing communication costs. Finally, extensive experiments on four real-world datasets validate that FedIS provides competitive performance on federated recommendation. Our proposed method also shows significant training efficiency with less performance degradation.
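Below is a minimal sketch of what one FedIS-style training round could look like, under stated assumptions: the abstract does not specify the local loss, the aggregation rule, or the perturbation mechanism, so the pointwise log loss, the Reptile-style outer step (standing in for the meta-learning-inspired aggregation), the Gaussian noise, and the top-k sparsification used here are all illustrative choices, as are the names `client_delta` and `server_round`. The only property taken from the abstract is structural: clients optimize a shared item feature matrix and no user features ever leave the device.

```python
# Hypothetical sketch of a FedIS-style round (update rules are assumptions;
# the abstract does not specify them). The model keeps only a shared item
# feature matrix Q, so no user embeddings are transmitted to the server.
import numpy as np

rng = np.random.default_rng(0)
n_items, dim = 200, 16
Q = rng.normal(scale=0.1, size=(n_items, dim))  # global item feature matrix

def client_delta(Q_global, interacted, lr=0.05, noise_scale=0.01, top_k=50):
    """Locally optimize the shared item matrix on one user's items, then
    apply a two-stage perturbation: Gaussian noise on the update and top-k
    sparsification to cut communication (both hypothetical choices)."""
    Q = Q_global.copy()
    user_vec = Q[interacted].mean(axis=0)  # item-based user profile
    negatives = rng.choice(
        np.setdiff1d(np.arange(n_items), interacted),
        size=min(2 * len(interacted), n_items - len(interacted)),
        replace=False,
    )
    for j, y in [(j, 1.0) for j in interacted] + [(j, 0.0) for j in negatives]:
        pred = 1.0 / (1.0 + np.exp(-user_vec @ Q[j]))
        Q[j] -= lr * (pred - y) * user_vec  # SGD on a pointwise log loss
    delta = Q - Q_global
    delta += rng.normal(scale=noise_scale, size=delta.shape)  # stage 1: perturb
    keep = np.argsort(-np.abs(delta).sum(axis=1))[:top_k]     # stage 2: sparsify
    sparse = np.zeros_like(delta)
    sparse[keep] = delta[keep]
    return sparse

def server_round(Q, client_items, beta=0.5):
    """Meta-learning-inspired aggregation (a Reptile-style outer step is
    assumed): move the global matrix toward the averaged client deltas."""
    deltas = [client_delta(Q, items) for items in client_items]
    return Q + beta * np.mean(deltas, axis=0)

# Toy usage: three clients, each holding a handful of interacted items.
clients = [rng.choice(n_items, size=8, replace=False) for _ in range(3)]
Q = server_round(Q, clients)
```

The outer step size `beta` plays the role the abstract assigns to the fast-convergent aggregation: rather than plainly averaging heterogeneous client updates, the server takes an adaptive interpolation step toward them, which is one common meta-learning-flavored way to stabilize convergence under non-IID data.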
