Assessment of disclosure risk is of paramount importance in the research and application of data privacy techniques. The concept of differential privacy (DP) formalizes privacy in probabilistic terms and provides robust privacy protection without making assumptions about the background knowledge of adversaries. Practical applications of DP involve developing DP mechanisms that release results at a pre-specified privacy budget. In this paper, we generalize the widely used Laplace mechanism to the family of generalized Gaussian (GG) mechanisms, based on the global sensitivity of statistical queries. We explore the theoretical requirements for the GG mechanism to reach DP at pre-specified privacy parameters, and investigate the connections and differences between the GG mechanism and the Exponential mechanism based on the GG distribution. We also present a lower bound on the scale parameter of the Gaussian mechanism under probabilistic DP as a special case of the GG mechanism, and compare the statistical utility of the sanitized results, in terms of tail probability and dispersion, between the Gaussian and Laplace mechanisms. Lastly, we apply the GG mechanism in three experiments (on the mildew, Czech, and adult data sets), compare the accuracy of the sanitized results via distance measures and the Kullback-Leibler divergence, and examine how sanitization affects the prediction power of a classifier constructed from the sanitized data in the adult experiment.
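The GG family referred to above has density proportional to exp(-(|x|/b)^p), so p = 1 recovers the Laplace distribution and p = 2 the Gaussian. A minimal sketch of additive GG noise is given below, using the standard gamma representation of the GG law (if G ~ Gamma(1/p, 1), then sign * b * G^(1/p) follows the GG distribution with scale b). The function names are illustrative, and the calibration of the scale b to the privacy parameters for general p is the subject of the paper and is not reproduced here:

```python
import random


def gg_noise(scale, p, rng=random):
    """Draw one sample from the generalized Gaussian distribution with
    density proportional to exp(-(|x| / scale) ** p), via the gamma
    representation: if G ~ Gamma(1/p, 1), then sign * scale * G**(1/p)
    has the GG law with shape p and scale `scale`."""
    g = rng.gammavariate(1.0 / p, 1.0)
    sign = 1.0 if rng.random() < 0.5 else -1.0
    return sign * scale * g ** (1.0 / p)


def gg_mechanism(true_value, scale, p, rng=random):
    """Sanitize a scalar query result by adding GG noise.

    p = 1 gives Laplace noise (the Laplace mechanism, with
    scale = global_sensitivity / epsilon for epsilon-DP);
    p = 2 gives Gaussian noise (the Gaussian mechanism)."""
    return true_value + gg_noise(scale, p, rng)
```

For p = 2 and scale b = 1 the noise variance is Gamma(3/2)/Gamma(1/2) = 1/2, and for p = 1 it is Gamma(3)/Gamma(1) = 2, which is one way to check the sampler.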