arXiv:0811.2501

A statistical framework for differential privacy

16 November 2008
Larry A. Wasserman
Shuheng Zhou
Abstract

One goal of statistical privacy research is to construct a data release mechanism that protects individual privacy while preserving information content. An example is a random mechanism that takes an input database $X$ and outputs a random database $Z$ according to a distribution $Q_n(\cdot \mid X)$. Differential privacy is a privacy requirement, developed by computer scientists, under which $Q_n(\cdot \mid X)$ must be insensitive to changes in any single data point of $X$; this makes it difficult to infer from $Z$ whether a given individual is in the original database $X$. We consider differential privacy from a statistical perspective and examine several data release mechanisms that satisfy the requirement. We show that it is useful to compare these schemes by computing the rates of convergence of the distributions and densities constructed from the released data. We also study a general privacy method, the exponential mechanism, introduced by McSherry and Talwar (2007), and show that its accuracy is intimately linked to the rate at which the empirical distribution concentrates in a small ball around the true distribution.
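
For reference, the privacy requirement in question is the standard $\epsilon$-differential privacy condition: a mechanism $Q_n$ is $\epsilon$-differentially private if, for all databases $X$ and $X'$ differing in a single entry and every measurable set $S$ of output databases,

\[
  Q_n(Z \in S \mid X) \;\le\; e^{\epsilon} \, Q_n(Z \in S \mid X'),
\]

so the distribution of the released $Z$ changes by at most a factor $e^{\epsilon}$ when any one individual's record is altered.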
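
To make the exponential mechanism concrete, here is a minimal Python sketch (the function name and toy data are illustrative, not code from the paper): over a finite set of candidate outputs and a utility function $u$ with sensitivity $\Delta u$, the mechanism releases $z$ with probability proportional to $\exp(\epsilon\, u(X, z) / (2 \Delta u))$.

    import numpy as np

    def exponential_mechanism(candidates, utility, epsilon, sensitivity, rng=None):
        # McSherry-Talwar exponential mechanism over a finite candidate set:
        # sample z with probability proportional to
        # exp(epsilon * utility(z) / (2 * sensitivity)).
        rng = rng if rng is not None else np.random.default_rng()
        scores = np.array([utility(z) for z in candidates], dtype=float)
        logits = epsilon * scores / (2.0 * sensitivity)
        logits -= logits.max()  # stabilizes exp(); the sampling distribution is unchanged
        probs = np.exp(logits)
        probs /= probs.sum()
        return candidates[rng.choice(len(candidates), p=probs)]

    # Example: privately release the most common item. A counting utility has
    # sensitivity 1, since changing one record moves each count by at most 1.
    data = ["a", "b", "b", "c", "b", "a"]
    candidates = ["a", "b", "c"]
    private_mode = exponential_mechanism(
        candidates, lambda z: data.count(z), epsilon=1.0, sensitivity=1.0
    )

With the factor of 2 in the exponent, this sampling rule satisfies $\epsilon$-differential privacy for any utility function of sensitivity $\Delta u$.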
