Dirichlet Mechanism for Differentially Private KL Divergence Minimization

Abstract

Given an empirical distribution $f(x)$ of sensitive data $x$, we consider the task of minimizing $F(y) = D_{\text{KL}}(f(x) \Vert y)$ over a probability simplex, while protecting the privacy of $x$. We observe that, if we take the exponential mechanism and use the KL divergence as the loss function, then the resulting algorithm is the Dirichlet mechanism that outputs a single draw from a Dirichlet distribution. Motivated by this, we propose a Rényi differentially private (RDP) algorithm that employs the Dirichlet mechanism to solve the KL divergence minimization task. In addition, given $f(x)$ as above and $\hat{y}$ an output of the Dirichlet mechanism, we prove a probability tail bound on $D_{\text{KL}}(f(x) \Vert \hat{y})$, which is then used to derive a lower bound for the sample complexity of our RDP algorithm. Experiments on real-world datasets demonstrate advantages of our algorithm over Gaussian and Laplace mechanisms in supervised classification and maximum likelihood estimation.
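To make the core idea concrete: the exponential mechanism with loss $D_{\text{KL}}(f \Vert y)$ has output density proportional to $\exp(-\alpha \, D_{\text{KL}}(f \Vert y)) \propto \prod_i y_i^{\alpha f_i}$ over the simplex, which is a Dirichlet density with parameters $\alpha f_i + 1$. The sketch below illustrates this as a single Dirichlet draw. The concentration parameter `alpha`, the `+ 1` offset, and its privacy calibration are assumptions inferred from that exponential-mechanism form; the paper's exact parameterization and RDP accounting are not given in the abstract.

```python
import numpy as np

def dirichlet_mechanism(counts, alpha, rng=None):
    """Release a privatized probability vector via one Dirichlet draw.

    A minimal sketch, not the paper's exact calibration: assuming the
    exponential mechanism with KL loss yields the density
    prod_i y_i^(alpha * f_i), i.e. Dirichlet(alpha * f + 1), where f is
    the empirical distribution of the sensitive data. Larger `alpha`
    (assumed concentration/privacy parameter) concentrates the output
    around f, trading privacy for utility.
    """
    rng = np.random.default_rng(rng)
    f = np.asarray(counts, dtype=float)
    f = f / f.sum()                        # empirical distribution f(x)
    return rng.dirichlet(alpha * f + 1.0)  # single draw \hat{y} on the simplex

# Example: privatize a histogram over 4 categories.
y_hat = dirichlet_mechanism([30, 12, 50, 8], alpha=100.0)
print(y_hat, y_hat.sum())  # a probability vector summing to 1
```

Note that, unlike adding Gaussian or Laplace noise to $f(x)$, the output here lies on the probability simplex by construction, so no clipping or renormalization step is needed.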
