Abstract
Privacy-preserving data statistics and analysis have become a pressing need. Differential privacy (DP), as a rigorous privacy paradigm, has been widely adopted across many fields. However, in large-scale mobile applications where each user contributes multiple records, neither user-level DP nor record-level DP achieves a good trade-off between stringent privacy and high data utility. A privacy paradigm with the desired granularity is therefore needed. To this end, this paper proposes a fine-grained privacy paradigm called α-event-set differential privacy, which prevents adversaries from inferring any one of the α event-sets owned by a user during data statistics and analysis. We formally introduce the definition, properties, and baseline mechanisms of α-event-set DP. In addition, we implement and evaluate α-event-set DP on mean estimation, histogram estimation, and machine learning applications. The experimental results show that α-event-set DP achieves fine-grained privacy protection while maintaining high data utility.
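For context, the granularity distinctions mentioned above can be read off the neighboring relation in the standard DP guarantee. The sketch below restates ε-DP, which is standard; the reading of the α-event-set neighboring relation is an assumption inferred from this abstract, not the paper's formal statement.

% Standard \varepsilon-DP: a randomized mechanism \mathcal{M} satisfies \varepsilon-DP if,
% for all neighboring datasets D \sim D' and every output set S,
\[
  \Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S].
\]
% The protection granularity is fixed by the choice of neighbors D \sim D':
%   record-level DP:  D and D' differ in a single record (event);
%   user-level DP:    D and D' differ in all records of one user;
%   \alpha-event-set DP (assumed reading of this abstract): D and D' differ in
%   any one of the \alpha event-sets into which a user's records are grouped.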