Abstract

Privacy-preserving data statistics and analysis have become an urgent problem. Differential privacy (DP), as a rigorous privacy paradigm, has been widely adopted in various fields. However, in large-scale mobile applications where each user holds multiple records, neither user-level DP nor record-level DP achieves a good trade-off between stringent privacy and high data utility, so a privacy paradigm with a more suitable granularity is needed. To this end, this paper proposes a fine-grained privacy paradigm called α-event-set differential privacy, which prevents adversaries from inferring any one of the α event-sets owned by a user during data statistics and analysis. We formally introduce the definition, properties, and baseline mechanisms of α-event-set DP, and we implement and evaluate it on mean estimation, histogram estimation, and machine learning applications. The experimental results show that α-event-set DP achieves fine-grained privacy protection while maintaining high data utility.
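For background on one of the evaluated tasks, the sketch below shows the standard Laplace mechanism for ε-DP mean estimation at the record level; it is a generic illustration only, not the paper's α-event-set mechanism, and the names `values`, `B`, and `epsilon` are assumptions introduced here.

```python
# Minimal background sketch (NOT the paper's alpha-event-set mechanism):
# standard Laplace mechanism for epsilon-DP mean estimation over values
# clipped to [0, B]. All names here are illustrative assumptions.
import numpy as np


def dp_mean(values, B, epsilon, rng=None):
    """Estimate the mean of `values` (clipped to [0, B]) under epsilon-DP.

    Under record-level DP, changing one record shifts the clipped sum by at
    most B, so adding Laplace noise of scale B / epsilon to the sum suffices.
    The record count n is treated as public for simplicity.
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = np.clip(np.asarray(values, dtype=float), 0.0, B)
    noisy_sum = clipped.sum() + rng.laplace(scale=B / epsilon)
    return noisy_sum / len(clipped)


# Example: 1000 records in [0, 10], privacy budget epsilon = 1.0
data = np.random.default_rng(0).uniform(0, 10, 1000)
print(dp_mean(data, B=10.0, epsilon=1.0))
```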
