ResearchTrend.AI


Differential privacy with partial knowledge

2 May 2019
Damien Desfontaines, Esfandiar Mohammadi, Elisabeth Krahmer, David Basin
Abstract

Differential privacy offers formal quantitative guarantees for algorithms over datasets, but it assumes attackers that know and can influence all but one record in the database. This assumption often vastly overapproximates the attackers' actual strength, resulting in unnecessarily poor utility. Recent work has made significant steps towards privacy in the presence of partial background knowledge, which can model a realistic attacker's uncertainty. Prior work, however, has definitional problems for correlated data and does not precisely characterize the underlying attacker model. We propose a practical criterion to prevent problems due to correlations, and we show how to characterize attackers with limited influence or only partial background knowledge over the dataset. We use these foundations to analyze practical scenarios: we significantly improve known results about the privacy of counting queries under partial knowledge, and we show that thresholding can provide formal guarantees against such weak attackers, even with little entropy in the data. These results allow us to draw novel links between k-anonymity and differential privacy under partial knowledge. Finally, we prove composition results on differential privacy with partial knowledge, which quantify the privacy leakage of complex mechanisms. Our work provides a basis for formally quantifying the privacy of many widely used mechanisms, e.g., publishing the results of surveys, elections, or referendums, and releasing usage statistics of online services.
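The thresholding idea mentioned in the abstract can be illustrated as a noisy counting query that suppresses small counts before release. This is a minimal sketch only, not the paper's actual mechanism or analysis; the `epsilon` and `threshold` values are hypothetical choices:

```python
import math
import random

def noisy_thresholded_count(records, predicate, epsilon=1.0, threshold=10):
    """Release a count only if its noisy value clears a threshold.

    Illustrative sketch: the paper's mechanisms and guarantees differ;
    `epsilon` and `threshold` here are hypothetical parameters.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Laplace noise with scale 1/epsilon (a counting query has sensitivity 1),
    # sampled via inverse transform from a uniform variable.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    noisy = true_count + noise
    # Thresholding: suppress small counts rather than publishing them.
    return round(noisy) if noisy >= threshold else None
```

Suppressing counts below the threshold is what limits what a weak attacker with only partial knowledge can infer about any single record.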
