Abstract

Differential privacy is a formal mathematical standard for quantifying the degree to which individual privacy in a statistical database is preserved. A typical method for guaranteeing differential privacy is to add random noise to the original data before release. In this paper, we investigate the basic conditions for differential privacy under a general random-noise-adding mechanism, and then apply the results to the privacy analysis of a privacy-preserving consensus algorithm. Specifically, we obtain a necessary and sufficient condition for differential privacy, which provides a useful and efficient criterion for determining whether differential privacy is achieved. We use this result to analyze the privacy of several common random noise distributions, and the theory matches the existing literature in the special cases. Applying the theory, we then investigate the differential privacy properties of a privacy-preserving consensus algorithm and obtain a necessary condition for it to achieve differential privacy. In addition, we prove that average consensus and differential privacy cannot be guaranteed simultaneously by any privacy-preserving consensus algorithm.
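As a concrete illustration of the noise-adding mechanism discussed in the abstract, the sketch below shows the classic Laplace mechanism together with a noise-perturbed consensus update. This is a minimal sketch under stated assumptions, not the paper's construction: the function names, the choice of Laplace noise, and the update form x(k+1) = W x(k) + theta(k) are illustrative assumptions introduced here.

```python
import numpy as np

def laplace_release(data, epsilon, sensitivity=1.0):
    """Release data perturbed by i.i.d. Laplace noise.

    Adding Lap(sensitivity / epsilon) noise to a query with the given
    L1 sensitivity is the classic way to obtain epsilon-differential
    privacy; it is one instance of the general noise-adding mechanism.
    """
    scale = sensitivity / epsilon
    return data + np.random.laplace(loc=0.0, scale=scale, size=np.shape(data))

def noisy_consensus_step(x, W, scale):
    """One iteration of a noise-perturbed consensus update,
    x(k+1) = W x(k) + theta(k), where theta(k) is the privacy noise
    each node injects before sharing its state with its neighbors
    (W is a row-stochastic weight matrix over the network graph).
    """
    return W @ x + np.random.laplace(loc=0.0, scale=scale, size=x.shape)
```

The sketch also hints at the tension the abstract's impossibility result formalizes: the noise theta(k) that protects each node's initial state also perturbs the iterates, so the states cannot converge exactly to the average of the initial values while the noise remains large enough to provide differential privacy.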

