arXiv:1302.3203
Local Privacy and Statistical Minimax Rates

13 February 2013
John C. Duchi
Michael I. Jordan
Martin J. Wainwright
Abstract

Working under a model of privacy in which data remains private even from the statistician, we study the tradeoff between privacy guarantees and the utility of the resulting statistical estimators. We prove bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that influence estimation rates as a function of the amount of privacy preserved. When combined with standard minimax techniques such as Le Cam's and Fano's methods, these inequalities allow for a precise characterization of statistical rates under local privacy constraints. We provide a treatment of several canonical problem families: mean estimation, parameter estimation in fixed-design regression, multinomial probability estimation, and non-parametric density estimation. For all of these families, we provide lower and upper bounds that match up to constant factors, giving privacy-preserving mechanisms and computationally efficient estimators that achieve the bounds.
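To make the local-privacy setting concrete, the sketch below illustrates one simple locally private mechanism for the mean-estimation problem the abstract mentions: each individual adds Laplace noise to their own bounded value before releasing it, so the statistician only ever sees perturbed data. This is a generic illustrative mechanism under assumed bounds, not necessarily the optimal mechanism the paper constructs; the function names and the interval bound `B` are hypothetical.

```python
import numpy as np

def privatize(x, epsilon, B=1.0, rng=None):
    """Release a locally epsilon-differentially private view of x in [-B, B].

    The identity query on [-B, B] has sensitivity 2B, so Laplace noise
    with scale 2B / epsilon suffices for epsilon-local privacy.
    (Illustrative mechanism; not the paper's specific construction.)
    """
    rng = rng or np.random.default_rng()
    return x + rng.laplace(scale=2.0 * B / epsilon)

def private_mean(samples, epsilon, B=1.0, rng=None):
    """Estimate the mean from independently perturbed samples.

    Each sample is privatized before aggregation, so raw data never
    reaches the statistician; the added noise is mean-zero, and the
    estimator stays unbiased at the cost of extra variance.
    """
    rng = rng or np.random.default_rng()
    return float(np.mean([privatize(x, epsilon, B, rng) for x in samples]))

# Example: with many samples, the private estimate tracks the true mean.
rng = np.random.default_rng(0)
data = rng.uniform(-1.0, 1.0, size=100_000)
estimate = private_mean(data, epsilon=1.0, rng=rng)
```

The extra per-sample noise variance, here 2(2B/&epsilon;)&sup2;, inflates the estimation error relative to the non-private rate; quantifying exactly how much utility must be sacrificed for a given privacy level &epsilon; is the kind of tradeoff the paper's matching lower and upper bounds characterize.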