Dirichlet Mechanism for Differentially Private KL Divergence Minimization

Given an empirical distribution $\hat{p}$ of sensitive data $x$, we consider the task of minimizing $D_{\mathrm{KL}}(\hat{p}\,\|\,q)$ over $q$ in the probability simplex, while protecting the privacy of $x$. We observe that, if we take the exponential mechanism with the KL divergence as the loss function, then the resulting algorithm is the Dirichlet mechanism, which outputs a single draw from a Dirichlet distribution. Motivated by this, we propose a Rényi differentially private (RDP) algorithm that employs the Dirichlet mechanism to solve the KL divergence minimization task. In addition, given $\hat{p}$ as above and an output $q$ of the Dirichlet mechanism, we prove a probability tail bound on $D_{\mathrm{KL}}(\hat{p}\,\|\,q)$, which is then used to derive a lower bound on the sample complexity of our RDP algorithm. Experiments on real-world datasets demonstrate advantages of our algorithm over the Gaussian and Laplace mechanisms in supervised classification and maximum likelihood estimation.
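The mechanism itself is simple to state: it releases a single sample from a Dirichlet distribution whose parameters are centered on the empirical distribution. Below is a minimal sketch in Python, assuming a concentration parameter `k` that trades off privacy and utility; the paper's exact calibration of this parameter to the RDP budget is not reproduced here.

```python
import numpy as np

def dirichlet_mechanism(counts, k=10.0, rng=None):
    """Release one draw from Dirichlet(k * p_hat), where p_hat is the
    empirical distribution of the sensitive counts.

    `k` is a hypothetical concentration parameter: larger k concentrates
    the draw around p_hat (better utility, weaker privacy). Its precise
    relation to the Renyi DP guarantee follows the paper's analysis.
    """
    rng = rng or np.random.default_rng()
    counts = np.asarray(counts, dtype=float)
    p_hat = counts / counts.sum()
    # Dirichlet parameters must be strictly positive; clip empty bins.
    alpha = np.maximum(k * p_hat, 1e-12)
    return rng.dirichlet(alpha)

# Example: privatize an empirical distribution over 4 categories.
private_q = dirichlet_mechanism([30, 12, 5, 3], k=50.0)
```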