
Robust Estimation of Discrete Distributions under Local Differential Privacy

Abstract

Although robust learning and local differential privacy are both widely studied fields of research, combining the two settings is only starting to be explored. We consider the problem of estimating a discrete distribution in total variation from $n$ contaminated data batches under a local differential privacy constraint. A fraction $1-\epsilon$ of the batches contain $k$ i.i.d. samples drawn from a discrete distribution $p$ over $d$ elements. To protect the users' privacy, each of the samples is privatized using an $\alpha$-locally differentially private mechanism. The remaining $\epsilon n$ batches are an adversarial contamination. The minimax rate of estimation under contamination alone, with no privacy, is known to be $\epsilon/\sqrt{k}+\sqrt{d/kn}$, up to a $\sqrt{\log(1/\epsilon)}$ factor. Under the privacy constraint alone, the minimax rate of estimation is $\sqrt{d^2/\alpha^2 kn}$. We show that combining the two constraints leads to a minimax estimation rate of $\epsilon\sqrt{d/\alpha^2 k}+\sqrt{d^2/\alpha^2 kn}$ up to a $\sqrt{\log(1/\epsilon)}$ factor, larger than the sum of the two separate rates. We provide a polynomial-time algorithm achieving this bound, as well as a matching information theoretic lower bound.
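The abstract does not specify which $\alpha$-locally differentially private mechanism is applied to each sample. A standard example of such a mechanism for a discrete alphabet of size $d$ is $d$-ary randomized response, sketched below as an illustration (the paper's actual mechanism may differ):

```python
import math
import random

def randomized_response(x, d, alpha):
    """d-ary randomized response: one standard alpha-LDP mechanism
    for a sample x in {0, ..., d-1} (illustrative; not necessarily
    the mechanism used in the paper).

    Reports the true value with probability e^alpha / (e^alpha + d - 1),
    and otherwise one of the d - 1 other values uniformly at random.
    The likelihood ratio between any two inputs for any output is at
    most e^alpha, which is exactly the alpha-LDP guarantee.
    """
    p_true = math.exp(alpha) / (math.exp(alpha) + d - 1)
    if random.random() < p_true:
        return x
    # Report a uniformly random value different from x.
    other = random.randrange(d - 1)
    return other if other < x else other + 1
```

For small $\alpha$, the report is close to uniform over the $d$ symbols, which is the source of the extra $d/\alpha^2$ blow-up in the private estimation rates quoted above.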
