
Contraction of $E_\gamma$-Divergence and Its Applications to Privacy

Abstract

We investigate the contraction coefficients derived from strong data processing inequalities for the $E_\gamma$-divergence. By generalizing Dobrushin's celebrated coefficient from total variation distance to $E_\gamma$-divergence, we derive a closed-form expression for the contraction of $E_\gamma$-divergence. This result has fundamental consequences in two privacy settings. First, it implies that local differential privacy can be equivalently expressed in terms of the contraction of $E_\gamma$-divergence. This equivalence can be used to precisely quantify the impact of local privacy on (Bayesian and minimax) estimation and hypothesis testing problems in terms of the reduction of effective sample size. Second, it leads to a new information-theoretic technique for analyzing the privacy guarantees of online algorithms. In this technique, we view such algorithms as compositions of amplitude-constrained Gaussian channels and relate their contraction coefficients under $E_\gamma$-divergence to the overall differential privacy guarantees. As an example, we apply our technique to derive the differential privacy parameters of gradient descent. We also show that this framework can be tailored to batch learning algorithms that can be implemented with one pass over the training dataset.
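
The abstract does not spell out the underlying definitions; for orientation, the following is a sketch of the standard objects involved, written in our own notation (the paper's exact statements may differ). For $\gamma \ge 1$ and a Markov kernel $K$:

\[
  E_\gamma(P \,\|\, Q) \;=\; \sup_{A}\,\bigl[\,P(A) - \gamma\, Q(A)\,\bigr],
  \qquad E_1(P \,\|\, Q) = \mathrm{TV}(P, Q),
\]
\[
  \eta_\gamma(K) \;=\; \sup_{P,\,Q:\; E_\gamma(P\|Q) > 0}
    \frac{E_\gamma(PK \,\|\, QK)}{E_\gamma(P \,\|\, Q)}.
\]

Dobrushin's result is the $\gamma = 1$ case: $\eta_1(K) = \max_{x,\,x'} \mathrm{TV}\bigl(K(\cdot \mid x),\, K(\cdot \mid x')\bigr)$.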

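
For concreteness, here is a minimal, hypothetical sketch (in Python) of the kind of noisy gradient descent the technique applies to: each update is norm-clipped and then perturbed by Gaussian noise, so every iteration acts as an amplitude-constrained Gaussian channel, which is the structure the contraction analysis exploits. All names and parameters below are illustrative, not taken from the paper.

import numpy as np

def noisy_gradient_descent(grad_fn, x0, steps, lr, clip, sigma, rng=None):
    """Illustrative sketch of noisy gradient descent.

    Each gradient is clipped to norm `clip` (the amplitude constraint)
    and the update is perturbed by Gaussian noise of scale `sigma`,
    so each iteration is an amplitude-constrained Gaussian channel.
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad_fn(x)
        # Clip the gradient so the channel input has bounded amplitude.
        norm = np.linalg.norm(g)
        if norm > clip:
            g = g * (clip / norm)
        # Gaussian mechanism: add isotropic noise to the noisy update.
        x = x - lr * g + rng.normal(scale=sigma, size=x.shape)
    return x

# Example: privately minimize ||x||^2 (purely illustrative settings).
x_final = noisy_gradient_descent(
    grad_fn=lambda x: 2 * x, x0=np.ones(3),
    steps=100, lr=0.1, clip=1.0, sigma=0.5,
)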