Abstract
Modern recommender systems are trained to predict users' potential future interactions from their historical behavior data. During this interaction process, although behavior data come from the user side, recommender systems also generate exposure data when providing users with personalized recommendation slates. Compared with the sparse user behavior data, system exposure data are much larger in volume, since only very few exposed items are actually clicked by the user. In addition, user historical behavior data are privacy sensitive and are commonly protected with careful access authorization. However, the large volume of exposure data generated by the service provider itself usually receives less attention and can be accessed by a relatively larger scope of information seekers, or even potential adversaries. In this article, we investigate the problem of user behavior data leakage in recommender systems. We show that privacy-sensitive user past behavior data can be inferred by modeling system exposure: one can infer which items a user has clicked solely from the current system exposure for that user. Given that exposure data can be widely accessed, we believe that user past behavior privacy is at high risk of leakage in recommender systems. More precisely, we construct an attack model whose input is the current recommended item slate (i.e., the system exposure) for a user and whose output is the user's historical behavior. Specifically, we exploit an encoder-decoder structure to build the attack model and apply different encoding and decoding strategies to verify attack performance. Experimental results on two real-world datasets indicate a great danger of user behavior data leakage.
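The encoder-decoder attack described above can be sketched minimally as follows. This is not the paper's actual architecture; it is a toy illustration in which the exposed slate is encoded as the mean of (here randomly initialized, in practice learned) item embeddings, and the decoder scores every candidate item against that representation, predicting the top-scoring items as the user's past clicks. All sizes and names (`N_ITEMS`, `DIM`, `SLATE`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N_ITEMS, DIM, SLATE = 100, 16, 10  # hypothetical catalog/embedding/slate sizes

# Hypothetical item embeddings; a real attack would learn these end to end.
item_emb = rng.normal(size=(N_ITEMS, DIM))

def encode(slate):
    """Encode the exposed slate as the mean of its item embeddings."""
    return item_emb[slate].mean(axis=0)

def decode(z):
    """Score every candidate item against the slate representation."""
    return item_emb @ z

# The adversary only observes the current exposure slate...
exposure = rng.choice(N_ITEMS, size=SLATE, replace=False)
scores = decode(encode(exposure))
# ...and predicts the top-k scored items as the user's private click history.
predicted_history = np.argsort(-scores)[:5]
```

Different encoding strategies (e.g., sequence models instead of mean pooling) slot into `encode`, which is how the paper's "different encoding and decoding strategies" would be compared under this sketch.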
To address this risk, we propose a two-stage privacy-protection mechanism that first selects a subset of items from the exposure slate and then replaces the selected items with uniform or popularity-based exposure. Experimental evaluation reveals a trade-off between recommendation accuracy and privacy disclosure risk, which is an interesting and important topic for privacy concerns in recommender systems.
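The two-stage mechanism can likewise be sketched: stage one selects a random subset of slate positions, and stage two overwrites them with items sampled either uniformly or in proportion to a (here fabricated) popularity distribution. The function name `protect` and all constants are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
N_ITEMS, SLATE, K = 100, 10, 3  # hypothetical catalog/slate sizes, K items replaced

# Hypothetical popularity counts used for popularity-based replacement.
popularity = rng.integers(1, 1000, size=N_ITEMS).astype(float)

def protect(slate, k, mode="uniform"):
    """Two-stage protection: select k slate positions, then replace them."""
    slate = slate.copy()
    idx = rng.choice(len(slate), size=k, replace=False)       # stage 1: select
    if mode == "uniform":
        repl = rng.choice(N_ITEMS, size=k, replace=False)     # uniform exposure
    else:
        p = popularity / popularity.sum()
        repl = rng.choice(N_ITEMS, size=k, replace=False, p=p)  # popularity-based
    slate[idx] = repl                                          # stage 2: replace
    return slate

exposure = rng.choice(N_ITEMS, size=SLATE, replace=False)
protected = protect(exposure, K, mode="popularity")
```

Increasing `K` hides more of the true exposure from an adversary but also degrades the slate actually shown to the user, which is the accuracy/privacy trade-off the evaluation measures.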