Statistical Properties of Sanitized Results from Differentially Private Laplace Mechanisms with Noninformative Bounding

Abstract

Protection of individual privacy is a common concern when releasing and sharing data and information. Differential privacy (DP) formalizes privacy in probabilistic terms without making assumptions about the background knowledge of data intruders, and thus provides a robust concept for privacy protection. Practical applications of DP involve the development of differentially private mechanisms to generate sanitized results at a pre-specified privacy budget. In the sanitization of bounded statistics such as proportions and correlation coefficients, the bounding constraints need to be incorporated in the differentially private mechanisms. There has been little work examining the consequences of incorporating bounding constraints on the accuracy of sanitized results and on the statistical inferences based on the sanitized results from a differentially private mechanism. In this paper, we define noninformative bounding procedures and formalize the differentially private truncated and boundary inflated truncated (BIT) mechanisms for releasing statistics with bounding constraints. The impacts of the noninformative truncated and BIT Laplace mechanisms on the statistical accuracy and utility of sanitized statistics, including bias, asymptotic consistency, and mean squared error, are evaluated both theoretically and empirically via simulation studies. We also provide an upper bound for the mean squared error between the sanitized and original results for finite n in the truncated Laplace and BIT Laplace mechanisms; the bound goes to 0 if the scale parameter goes to 0 as n → ∞.
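The two mechanisms named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the truncated mechanism is implemented here by rejection sampling (equivalent to drawing from the Laplace density renormalized over the bounds), the BIT mechanism by clamping out-of-bound draws to the nearest boundary (placing point mass at the boundaries), and the sensitivity-based scale for a proportion is an illustrative assumption.

```python
import numpy as np

def truncated_laplace(x, scale, lo, hi, rng):
    # Truncated mechanism: resample until the sanitized value lies in [lo, hi].
    # Rejection sampling is equivalent to sampling from the Laplace density
    # renormalized over [lo, hi].
    while True:
        s = x + rng.laplace(0.0, scale)
        if lo <= s <= hi:
            return s

def bit_laplace(x, scale, lo, hi, rng):
    # Boundary inflated truncated (BIT) mechanism: out-of-bound draws are
    # set to the nearest boundary, inflating the probability mass at lo and hi.
    s = x + rng.laplace(0.0, scale)
    return min(max(s, lo), hi)

# Example: sanitize a sample proportion bounded in [0, 1].
# The scale 1/(n * epsilon) assumes the global sensitivity of a
# proportion over n records is 1/n (an illustrative choice).
rng = np.random.default_rng(0)
n, p_hat, epsilon = 500, 0.37, 1.0
scale = 1.0 / (n * epsilon)
print(truncated_laplace(p_hat, scale, 0.0, 1.0, rng))
print(bit_laplace(p_hat, scale, 0.0, 1.0, rng))
```

As the abstract notes, both sanitized statistics concentrate around the original value as the scale parameter shrinks with growing n, which is what drives the mean squared error bound toward 0.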
