Abstract

Differential privacy is a formal mathematical framework for quantifying the individual privacy afforded by a statistical database. A typical way to guarantee differential privacy is to add random noise to the original data before release. In this paper, we investigate the conditions under which a general random noise adding mechanism achieves differential privacy in the single-dimensional case, and then apply the results to the privacy analysis of a privacy-preserving consensus algorithm. Specifically, we obtain a necessary and sufficient condition for $\epsilon$-differential privacy and sufficient conditions for $(\epsilon, \delta)$-differential privacy, and we use them to analyze various random noise distributions. For the special cases with known results, our theory not only matches the literature but also provides an efficient approach to estimating the privacy parameters; for the cases that were previously unknown, it offers a simple and effective tool for differential privacy analysis. Applying the theory to the privacy-preserving consensus algorithm, we obtain a necessary condition and a sufficient condition for ensuring differential privacy.
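
For reference (the paper's specific conditions are not reproduced in the abstract), the privacy notions above follow the standard definitions: a randomized mechanism $M$ is $\epsilon$-differentially private if, for all adjacent datasets $D$ and $D'$ (differing in one individual's record) and every measurable output set $S$,

$$
\Pr[M(D) \in S] \le e^{\epsilon} \, \Pr[M(D') \in S],
$$

and $(\epsilon, \delta)$-differentially private if

$$
\Pr[M(D) \in S] \le e^{\epsilon} \, \Pr[M(D') \in S] + \delta.
$$

The canonical single-dimensional noise adding mechanism is the Laplace mechanism, which adds noise with density $\frac{1}{2b} e^{-|x|/b}$; choosing $b = \Delta/\epsilon$, where $\Delta$ is the query's sensitivity, achieves $\epsilon$-differential privacy.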
