Abstract

Privacy-preserving methods for tuple data release have attracted the attention of researchers across disciplines. Among the advanced methods, differential privacy (DP), which introduces independent Laplace noise, has become an influential privacy mechanism owing to its provable and rigorous privacy guarantee. In practice, however, the tuple data to be protected are often correlated, and independent noise may cause more information disclosure than expected. Recent studies attempt to optimize the sensitivity function of DP to account for the correlation strength between data, but they suffer from a substantial growth in noise level. Therefore, for correlated tuple data release, how to reduce the noise level incurred by correlation strength is yet to be explored. To remedy this problem, this paper examines the degradation of DP's expected privacy level on correlated tuple data and proposes a solution to mitigate it. We first demonstrate a filtering attack, showing that the differing dependence structures of the original outputs and the perturbations can be exploited to filter out part of the noise and extract an individual's sensitive information. Second, we introduce the notion of correlated tuple differential privacy (CTDP) to preserve the expected privacy for correlated tuple data, and we propose a generalized Laplace mechanism (GLM) to achieve the privacy guarantees of CTDP. We then design a practical iteration mechanism, including an update function, to run GLM under large-scale queries. Finally, experimental evaluations on real-world datasets from multiple fields show that our solution consistently outperforms state-of-the-art mechanisms in data utility while providing the same privacy guarantee as other approaches for correlated tuple data.
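For context, the baseline the abstract refers to is the standard Laplace mechanism, which perturbs each query answer with independent noise scaled to the query's sensitivity. Below is a minimal sketch of that baseline only, not the paper's generalized Laplace mechanism (GLM); the function name and parameters are illustrative.

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Standard epsilon-DP Laplace mechanism with independent noise.

    Adds Laplace noise with scale sensitivity / epsilon to a query answer.
    Note: this is the textbook baseline; it is exactly the independent-noise
    assumption that the paper argues a filtering attack can exploit when the
    underlying tuples are correlated.
    """
    scale = sensitivity / epsilon
    return true_answer + np.random.laplace(loc=0.0, scale=scale)

# Example: privatize a count query (sensitivity 1) at epsilon = 0.5.
noisy_count = laplace_mechanism(true_answer=1024.0, sensitivity=1.0, epsilon=0.5)
print(noisy_count)
```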
