Abstract

Although the massive real-time data collected from users can improve the quality of daily life, it may also expose users' privacy. $\epsilon$-differential privacy is a well-known model that provides strong privacy guarantees for statistics. Existing works focus on $\omega$-event differential privacy with a fixed window size, which may be unsuitable for many practical scenarios. To address this issue, we explore a real-time scheme with an adaptive $\omega$-event for differentially private time-series publishing (ADP). Specifically, we define a novel notion, Quality of Privacy (QoP), to measure both the utility of the released statistics and the effectiveness of privacy preservation. Based on this notion, we present an adaptive $\omega$-event differential privacy model that achieves higher accuracy together with stronger privacy protection. In addition, we design a smart grouping mechanism to improve grouping performance and thereby the utility of the published statistics. Finally, we conduct experiments on real-world and synthetic datasets, demonstrating that the ADP scheme outperforms existing schemes.
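The abstract builds on $\omega$-event differential privacy, whose defining invariant is that the privacy budgets spent over any window of $\omega$ consecutive timestamps sum to at most $\epsilon$. The sketch below illustrates that invariant with a minimal publisher that perturbs a count stream with Laplace noise under a uniform $\epsilon/\omega$ budget split; the class and method names are illustrative, not from the paper, and this baseline is the fixed-window setting the paper's adaptive scheme improves on.

```python
import math
import random
from collections import deque

def laplace(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

class OmegaEventPublisher:
    """Toy publisher for a count stream under omega-event epsilon-DP.

    Invariant: the budgets spent over ANY omega consecutive timestamps
    sum to at most epsilon (uniform eps/omega split, sensitivity 1).
    """

    def __init__(self, epsilon: float, omega: int):
        self.epsilon = epsilon
        self.omega = omega
        self.spent = deque(maxlen=omega)  # budgets of the last omega steps
        self.history = []                 # full budget trace, for inspection

    def publish(self, true_count: float) -> float:
        # Budget still committed in the current window once the oldest
        # step has slid out of it.
        window_spent = sum(self.spent)
        if len(self.spent) == self.omega:
            window_spent -= self.spent[0]
        # Uniform strategy: spend at most eps/omega per step, capped by
        # whatever the window invariant still allows.
        eps_t = min(self.epsilon / self.omega, self.epsilon - window_spent)
        self.spent.append(eps_t)   # deque evicts the step leaving the window
        self.history.append(eps_t)
        return true_count + laplace(1.0 / eps_t)
```

With a uniform split every step spends exactly $\epsilon/\omega$, so every $\omega$-window total is exactly $\epsilon$; the paper's adaptive approach instead varies the window size and allocation to trade accuracy against protection per the QoP measure.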
