On the Geometry of Differential Privacy
We consider the noise complexity of differentially private mechanisms in the setting where the user asks d linear queries f: ℝ^n → ℝ non-adaptively. Here, the database is represented by a vector in ℝ^n, and proximity between databases is measured in the ℓ_1-metric. We show that the noise complexity is completely determined by two geometric parameters associated with the set of queries. We use this connection to give tight upper and lower bounds on the noise complexity for any d ≤ n. We show that for d random linear queries of sensitivity 1, it is necessary and sufficient to add ℓ_2-error Θ(min{d√d/ε, d√(log(n/d))/ε}) to achieve ε-differential privacy. We can achieve the same upper bound for non-random queries as well, assuming the truth of a deep conjecture from convex geometry, known as the Hyperplane conjecture. Our bound translates to error O(min{d/ε, √(d log(n/d))/ε}) per answer. The best previous upper bound (the Laplacian mechanism) gives a bound of O(d/ε) per answer, while the best known lower bound was Ω(√d/ε). In contrast, our lower bound is strong enough to separate the notion of ε-differential privacy from the weaker notion of approximate differential privacy, where an upper bound of O(√d/ε) per answer can be achieved.
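To make the Laplacian baseline concrete, here is a minimal sketch (not from the paper; the function name and setup are illustrative). For d linear queries of sensitivity 1 over a database in ℝ^n with the ℓ_1-metric, the answer vector F @ x changes by at most d in ℓ_1 between neighboring databases, so adding independent Laplace noise of scale d/ε to each answer gives ε-differential privacy, at expected error d/ε per answer:

```python
import numpy as np

def laplace_mechanism(x, F, eps, rng=None):
    """Privately answer d linear queries F @ x.

    x   : database vector in R^n
    F   : d x n query matrix with entries in [-1, 1], so each query
          has sensitivity 1 w.r.t. the l1-metric on databases
    eps : privacy parameter

    A move to a neighboring database (l1-distance 1) changes F @ x by
    at most d in the l1-norm, so Laplace(d/eps) noise per coordinate
    suffices for eps-differential privacy.
    """
    rng = rng or np.random.default_rng()
    d = F.shape[0]
    noise = rng.laplace(loc=0.0, scale=d / eps, size=d)
    return F @ x + noise
```

This is the O(d/ε)-per-answer baseline the abstract compares against; the paper's K-norm-style mechanism shapes the noise to the geometry of the query set to do better when d is small relative to n.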