With the explosion of digital data collected by social applications, privacy protection regulations have been enacted in most countries. Differential privacy has emerged as a successful technique for making use of such data without leaking personal information. In this paper, we investigate logical reasoning about differential privacy properties and propose model checking algorithms. We introduce the branching-time temporal logic dpCTL* for specifying differentially private properties, and formalize several differential privacy mechanisms as Markov chains or Markov decision processes. Within our framework, we show that subtle privacy conditions can be specified in dpCTL*. To verify privacy properties automatically, we study the corresponding model checking problems: we develop a model checking algorithm for Markov chains, show that model checking dpCTL* properties on Markov decision processes is undecidable, and therefore propose a sound algorithm for verifying such properties on Markov decision processes. We implement our model checking algorithm in a tool based on the PRISM language, and report experimental results together with a case study of stream processing in differential privacy under essential fairness assumptions.
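As a minimal illustration of the mechanism-as-probabilistic-model view alluded to above (this sketch is not taken from the paper): the classic randomized-response mechanism can be read as a two-state probabilistic transition system, and its privacy parameter computed directly from the transition probabilities. The function `dp_epsilon` below is a hypothetical helper name introduced here for illustration.

```python
import math

def dp_epsilon(p: float) -> float:
    """Smallest eps such that randomized response with truth-telling
    probability p is eps-differentially private.

    The mechanism reports the true bit with probability p and flips it
    otherwise, so P[out=yes | in=yes] = p and P[out=yes | in=no] = 1 - p.
    eps is the maximum over outputs o of |ln(P[o | yes] / P[o | no])|.
    """
    return max(abs(math.log(p / (1 - p))),
               abs(math.log((1 - p) / p)))

# For p = 0.75 the mechanism is (ln 3)-differentially private.
print(dp_epsilon(0.75))
```

Viewing such a mechanism as a (trivial) Markov chain over input and output states is what makes its privacy condition amenable to probabilistic model checking in the first place.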