Abstract

Differential privacy mechanisms offer a trade-off between privacy and utility, quantified by privacy metrics and utility metrics: as one increases, the other decreases. However, there is no unified measurement of this trade-off across differential privacy mechanisms. To this end, we propose the notion of privacy-preserving monotonicity of differential privacy, which measures the trade-off between privacy and utility. First, to formalize the trade-off, we present the definition of privacy-preserving monotonicity based on computational indistinguishability. Second, building on the privacy metrics of expected estimation error and entropy, we show, both theoretically and numerically, the privacy-preserving monotonicity of the Laplace mechanism, the Gaussian mechanism, the exponential mechanism, and the randomized response mechanism. We also analyze, theoretically and numerically, the utility monotonicity of these mechanisms using the utility metrics of the modulus of the characteristic function and a variant of normalized entropy. Third, based on privacy-preserving monotonicity, we present a method to seek the trade-off under a semi-honest model and analyze a unilateral trade-off under a rational model. Privacy-preserving monotonicity can therefore serve as a criterion to evaluate the trade-off between privacy and utility of differential privacy mechanisms under the semi-honest model; under the rational model, however, it results in a unilateral trade-off, which can lead to severe consequences.

Highlights

  • In the non-interactive or interactive model of computation over a database [1], a data curator's sensitive data needs to be protected, while a data analyst should still be able to obtain useful data for statistical analysis

  • We propose the definition of privacy-preserving monotonicity of differential privacy, based on the privacy and utility metrics of this paper, as a measurement of the trade-off between privacy and utility

  • Under the privacy metrics of expected estimation error and entropy, our theoretical and numerical results show that several differential privacy mechanisms, namely the Laplace mechanism, Gaussian mechanism, exponential mechanism, bivariate randomized response mechanism, and multivariate randomized response mechanism, ensure privacy-preserving monotonicity
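As an illustration of the bivariate randomized response mechanism named in the highlights, the following is a minimal Python sketch. The function names and the frequency estimator are our own for illustration and are not taken from the paper; only the flip probability e^ε / (1 + e^ε), the standard calibration for ε-differentially private binary randomized response, is assumed.

```python
import math
import random

def randomized_response(true_bit, epsilon, rng=random):
    """Bivariate (binary) randomized response: report the true bit with
    probability e^eps / (1 + e^eps), otherwise report its flip.
    This satisfies epsilon-differential privacy for a single bit."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_bit if rng.random() < p_truth else 1 - true_bit

def estimate_frequency(reports, epsilon):
    """Unbiased estimate of the true frequency f of 1-bits from noisy reports.
    Since E[report] = (2p - 1) * f + (1 - p), with p = e^eps / (1 + e^eps),
    we invert that affine relation."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    mean_report = sum(reports) / len(reports)
    return (mean_report - (1.0 - p)) / (2.0 * p - 1.0)
```

Smaller ε makes the flip probability approach 1/2, which strengthens privacy but inflates the variance of the frequency estimate, exactly the monotone privacy-utility trade-off the highlights describe.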


Introduction

In the non-interactive or interactive model of computation over a database [1], a data curator's sensitive data needs to be protected, while a data analyst should still be able to obtain useful data for statistical analysis. The privacy of sensitive data is easily protected by adding random noise to query results; however, the resulting availability of the data for analysts can no longer be guaranteed. For research on privacy preservation in statistical queries, a crucial question is the trade-off between privacy and utility [2]. The key challenge is to provide this trade-off between data privacy and data utility in computation over a database [3,4], which is achieved by using differential privacy. In this paper, expected estimation error [5] and entropy are adopted as privacy metrics.
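The noise-adding approach described above can be sketched as follows, assuming the standard Laplace mechanism, where noise with scale sensitivity/ε is added to the true query answer; the function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Return a noisy query answer satisfying epsilon-differential privacy.

    Noise is drawn from Laplace(0, sensitivity / epsilon), the standard
    calibration for the Laplace mechanism: smaller epsilon means more
    noise (stronger privacy) and lower utility of the released answer.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Example: a counting query (sensitivity 1) over a toy binary database.
db = [1, 0, 1, 1, 0, 1]
noisy_count = laplace_mechanism(float(sum(db)), sensitivity=1.0, epsilon=0.5)
```

Decreasing ε widens the noise distribution, so the analyst's estimate of the count degrades; this is the privacy-utility trade-off that the rest of the paper measures via privacy-preserving monotonicity.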
