Towards Formalizing the GDPR's Notion of Singling Out

12 April 2019
A. Cohen
Kobbi Nissim
arXiv:1904.06009
Abstract

There is a significant conceptual gap between legal and mathematical thinking around data privacy. The effect is uncertainty as to which technical offerings adequately match expectations expressed in legal standards. The uncertainty is exacerbated by a litany of successful privacy attacks, demonstrating that traditional statistical disclosure limitation techniques often fall short of the sort of privacy envisioned by legal standards. We define predicate singling out, a new type of privacy attack intended to capture the concept of singling out appearing in the General Data Protection Regulation (GDPR). Informally, an adversary predicate singles out a dataset X using the output of a data release mechanism M(X) if it manages to find a predicate p matching exactly one row x ∈ X with probability much better than a statistical baseline. A data release mechanism that precludes such attacks is secure against predicate singling out (PSO secure). We argue that PSO security is a mathematical concept with legal consequences. Any data release mechanism that purports to "render anonymous" personal data under the GDPR must be secure against singling out, and hence must be PSO secure. We then analyze PSO security, showing that it fails to self-compose. Namely, a combination of ω(log n) exact counts, each individually PSO secure, enables an attacker to predicate single out. In fact, the composition of just two PSO-secure mechanisms can fail to provide PSO security. Finally, we ask whether differential privacy and k-anonymity are PSO secure. Leveraging a connection to statistical generalization, we show that differential privacy implies PSO security. However, k-anonymity does not: there exists a simple and general predicate singling out attack under mild assumptions on the k-anonymizer and the data distribution.
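To make the core condition concrete, here is a minimal Python sketch, not taken from the paper: it only checks whether a predicate matches exactly one row and compares that against a rough statistical baseline on fresh data. The dataset, predicate, and baseline estimate are hypothetical, and a real PSO adversary would have to build its predicate from the mechanism's output M(X) alone rather than from X itself.

import numpy as np

# Toy illustration of the "exactly one match" condition behind predicate
# singling out. NOT the paper's construction: a real PSO adversary sees only
# M(X), never X. Dataset, predicate, and baseline below are hypothetical.

rng = np.random.default_rng(0)

# Hypothetical dataset: n rows of d uniformly random binary attributes.
n, d = 1000, 30
X = rng.integers(0, 2, size=(n, d))

def isolates(predicate, data):
    """True iff `predicate` matches exactly one row of `data`."""
    return sum(bool(predicate(row)) for row in data) == 1

# A very specific predicate: "the row equals this exact attribute vector".
target = X[0].copy()

def predicate(row):
    return np.array_equal(row, target)

print("isolates a row of X:", isolates(predicate, X))  # True by construction

# Rough baseline: how often the same predicate isolates a row in a fresh
# dataset from the same distribution. With per-row weight ~2^-d, this is
# about n * 2^-d ≈ 1e-6 here, so the gap versus the success above is large.
trials = 200
hits = sum(isolates(predicate, rng.integers(0, 2, size=(n, d)))
           for _ in range(trials))
print("baseline isolation rate:", hits / trials)

The paper's negative results concern how an adversary can assemble such an isolating predicate from seemingly innocuous outputs, for example a combination of ω(log n) exact counts; the sketch above only illustrates the isolation condition and the baseline it is measured against.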
