Abstract

With the development of Intelligent Transportation Systems (ITS), a vast amount of location data is being generated by IoT devices equipped with positioning sensors. Preserving privacy when releasing location data is a critical concern, as the publication of aggregated data often reveals private information about users. Differential Privacy (DP) has recently emerged as a robust framework for guaranteeing privacy in this context. However, conventional DP mechanisms commonly make no assumption about the distribution of the input data, which can lead to unexpected privacy leakage when the data are correlated. In this paper, we investigate the simultaneous impact of user correlation, spatiotemporal correlation, and an adversary's prior knowledge on the privacy leakage of a DP mechanism, a combination that has not been addressed in prior work. We derive several closed-form expressions that demonstrate and quantify the privacy leakage under correlated location data, and we design efficient algorithms to compute this leakage. We then propose Δ-CDP (Correlated Differential Privacy) to provide a formal privacy guarantee against the additional leakage incurred by these factors. Extensive comparisons, theoretical analysis, and experimental simulations validate the correctness and efficiency of the proposed work.
