Privacy of SGD under Gaussian or Heavy-Tailed Noise: Guarantees without Gradient Clipping

The injection of heavy-tailed noise into the iterates of stochastic gradient descent (SGD) has garnered growing interest in recent years due to its theoretical and empirical benefits for optimization and generalization. However, its implications for privacy preservation remain largely unexplored. Aiming to bridge this gap, we provide differential privacy (DP) guarantees for noisy SGD when the injected noise follows an α-stable distribution, a family that includes a spectrum of heavy-tailed distributions (with infinite variance) as well as the light-tailed Gaussian distribution. Considering the (ε, δ)-DP framework, we show that SGD with heavy-tailed perturbations achieves (0, Õ(1/n))-DP for a broad class of loss functions which can be non-convex, where n is the number of data points. As a remarkable byproduct, contrary to prior work that necessitates bounded sensitivity for the gradients or projecting the iterates onto a bounded set, our theory can handle unbounded gradients without clipping, and reveals that under mild assumptions such a projection step is not actually necessary. Our results suggest that, given the other benefits of heavy tails in optimization, heavy-tailed noising schemes can be a viable alternative to their light-tailed counterparts.
@article{şimşekli2025_2403.02051,
  title   = {Privacy of SGD under Gaussian or Heavy-Tailed Noise: Guarantees without Gradient Clipping},
  author  = {Umut Şimşekli and Mert Gürbüzbalaban and Sinan Yıldırım and Lingjiong Zhu},
  journal = {arXiv preprint arXiv:2403.02051},
  year    = {2025}
}
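To make the noising scheme concrete, the following is a minimal sketch of SGD with symmetric α-stable perturbations injected into the iterates and no gradient clipping or projection step. It is an illustration under assumptions, not the authors' exact algorithm: the function name `noisy_sgd`, the single-sample gradient estimator, and the choice to scale the noise by the step size are all hypothetical conveniences, and `scipy.stats.levy_stable` is used here simply as a convenient α-stable sampler (α = 2 recovers the Gaussian case).

```python
import numpy as np
from scipy.stats import levy_stable


def noisy_sgd(grad_fn, theta, data, n_steps, lr=0.01,
              alpha=1.8, noise_scale=0.1, seed=0):
    """SGD with symmetric alpha-stable noise added to each iterate.

    alpha = 2.0 gives light-tailed (Gaussian) noise; alpha < 2 gives
    heavy-tailed noise with infinite variance. Note that no clipping
    or projection is applied to the (possibly unbounded) gradients.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    for _ in range(n_steps):
        x = data[rng.integers(n)]  # single-sample stochastic gradient
        noise = levy_stable.rvs(alpha, beta=0.0, scale=noise_scale,
                                size=theta.shape, random_state=rng)
        theta = theta - lr * grad_fn(theta, x) + lr * noise
    return theta
```

For instance, with `grad_fn = lambda th, x: th - x` the sketch performs noisy mean estimation, the kind of simple (here even convex) problem on which such a scheme can be sanity-checked before moving to non-convex losses.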