Abstract

We consider the noise complexity of differentially private mechanisms in the setting where the user asks d linear queries f: ℝ^n → ℝ non-adaptively. Here, the database is represented by a vector in ℝ^n and proximity between databases is measured in the ℓ1-metric. We show that the noise complexity is determined by two geometric parameters associated with the set of queries. We use this connection to give tight upper and lower bounds on the noise complexity for any d ≤ n. We show that for d random linear queries of sensitivity 1, it is necessary and sufficient to add ℓ2-error Θ(min{d√d/ε, d√(log(n/d))/ε}) to achieve ε-differential privacy. Assuming the truth of a deep conjecture from convex geometry, known as the Hyperplane conjecture, we can extend our results to arbitrary linear queries, giving nearly matching upper and lower bounds. Our bound translates to error O(min{d/ε, √(d log(n/d))/ε}) per answer. The best previous upper bound (the Laplace mechanism) gives a bound of O(min{d/ε, √n/ε}) per answer, while the best known lower bound was Ω(√d/ε). In contrast, our lower bound is strong enough to separate the notion of differential privacy from that of approximate differential privacy, where an upper bound of O(√d/ε) per answer can be achieved.
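The per-answer bounds above are stated against the classical Laplace mechanism as a baseline. Purely as a point of reference, the following is a minimal Python/NumPy sketch of that baseline for d linear queries of sensitivity 1 over a database in ℝ^n; it is not the paper's correlated-noise mechanism, and the function name and parameters are illustrative.

```python
import numpy as np

def laplace_mechanism(x, F, epsilon, rng=None):
    """Answer the d linear queries Fx under epsilon-differential privacy by
    adding independent Laplace noise calibrated to the l1-sensitivity of F."""
    rng = np.random.default_rng() if rng is None else rng
    F = np.asarray(F, dtype=float)
    d = F.shape[0]
    # Each row of F is a linear query with coefficients in [-1, 1], i.e.
    # sensitivity 1, so moving the database x by l1-distance 1 changes the
    # answer vector Fx by at most d in l1.  Laplace noise of scale d/epsilon
    # per coordinate therefore suffices, giving O(d/epsilon) error per answer.
    noise = rng.laplace(loc=0.0, scale=d / epsilon, size=d)
    return F @ x + noise

# Hypothetical usage: three random +/-1 queries over a database in R^5.
rng = np.random.default_rng(0)
x = np.array([2.0, 0.0, 1.0, 3.0, 1.0])
F = rng.choice([-1.0, 1.0], size=(3, 5))
print(laplace_mechanism(x, F, epsilon=0.5, rng=rng))
```

Achieving the improved O(√(d log(n/d))/ε)-per-answer bound of the paper requires a different mechanism, whose noise depends on the geometry of the query set; it is not sketched here.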
