An Improved Privacy and Utility Analysis of Differentially Private SGD with Bounded Domain and Smooth Losses

25 February 2025
Hao Liang
Wanrong Zhang
Xinlei He
Kaishun Wu
Hong Xing
Abstract

Differentially Private Stochastic Gradient Descent (DPSGD) is widely used to protect sensitive data during the training of machine learning models, but its privacy guarantees often come at the cost of model performance, largely due to the inherent difficulty of accurately quantifying the privacy loss. While recent efforts have strengthened privacy guarantees by focusing solely on the final output and on bounded-domain cases, they still impose restrictive assumptions, such as convexity and other parameter limitations, and often lack a thorough utility analysis. In this paper, we provide a rigorous privacy and utility characterization of DPSGD with smooth loss functions in both bounded and unbounded domains. We track the privacy loss over multiple iterations by exploiting the noisy smooth-reduction property, and we establish the utility analysis by leveraging the non-expansiveness of the projection together with properties of clipped SGD. In particular, we show that for DPSGD with a bounded domain, (i) the privacy loss can still converge without the convexity assumption, and (ii) a smaller diameter of the bounded domain can improve both privacy and utility simultaneously under certain conditions. Numerical experiments validate our theoretical results.
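To make the setting concrete, below is a minimal sketch of the kind of algorithm the abstract analyzes: DPSGD with per-sample gradient clipping, Gaussian noise, and Euclidean projection onto an L2 ball playing the role of the bounded domain. This is not the paper's exact algorithm or notation; all names and parameters (projected_dpsgd, grad_fn, clip_norm, noise_mult, radius) are illustrative assumptions.

import numpy as np

def projected_dpsgd(grad_fn, x0, radius, clip_norm, noise_mult,
                    lr, num_steps, seed=0):
    """Sketch of DPSGD with per-sample clipping, Gaussian noise, and
    projection onto an L2 ball of the given radius (the bounded domain).

    grad_fn(x) must return per-sample gradients of shape (B, d).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(num_steps):
        g = grad_fn(x)                                   # shape (B, d)
        norms = np.linalg.norm(g, axis=1, keepdims=True)
        g = g * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
        b = g.shape[0]
        noise = rng.normal(0.0, noise_mult * clip_norm, size=x.shape)
        x = x - lr * (g.sum(axis=0) + noise) / b         # noisy averaged step
        # Euclidean projection onto the ball: a non-expansive map,
        # which is the property the utility analysis leans on.
        nrm = np.linalg.norm(x)
        if nrm > radius:
            x = x * (radius / nrm)
    return x

# Toy usage (illustrative only): private least squares on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(size=(128, 5))
    y = A @ np.ones(5) + 0.1 * rng.normal(size=128)
    grads = lambda x: (A @ x - y)[:, None] * A           # per-sample gradients
    x_priv = projected_dpsgd(grads, np.zeros(5), radius=5.0,
                             clip_norm=1.0, noise_mult=1.0,
                             lr=0.1, num_steps=200)
    print(x_priv)

Note how a smaller radius shrinks the feasible set: intuitively, the projection keeps iterates closer together, which is consistent with the abstract's claim that a smaller domain diameter can help both privacy and utility under certain conditions.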

@article{liang2025_2502.17772,
  title={An Improved Privacy and Utility Analysis of Differentially Private SGD with Bounded Domain and Smooth Losses},
  author={Hao Liang and Wanrong Zhang and Xinlei He and Kaishun Wu and Hong Xing},
  journal={arXiv preprint arXiv:2502.17772},
  year={2025}
}