ResearchTrend.AI
Differentially Private SGD with Non-Smooth Losses


22 January 2021
Puyu Wang, Yunwen Lei, Yiming Ying, Hai Zhang

Papers citing "Differentially Private SGD with Non-Smooth Losses"

9 papers
Stability and Generalization for Markov Chain Stochastic Gradient Methods
Puyu Wang, Yunwen Lei, Yiming Ying, Ding-Xuan Zhou
16 Sep 2022

Differentially Private Stochastic Gradient Descent with Low-Noise
Puyu Wang, Yunwen Lei, Yiming Ying, Ding-Xuan Zhou
09 Sep 2022

Sharper Utility Bounds for Differentially Private Models
Yilin Kang, Yong Liu, Jian Li, Weiping Wang
22 Apr 2022

Differentially Private SGDA for Minimax Problems
Zhenhuan Yang, Shu Hu, Yunwen Lei, Kush R. Varshney, Siwei Lyu, Yiming Ying
22 Jan 2022

Differentially Private Coordinate Descent for Composite Empirical Risk Minimization
Paul Mangold, A. Bellet, Joseph Salmon, Marc Tommasi
22 Oct 2021

Stability and Generalization for Randomized Coordinate Descent
Puyu Wang, Liang Wu, Yunwen Lei
17 Aug 2021

Improved Learning Rates for Stochastic Optimization: Two Theoretical Viewpoints
Shaojie Li, Yong Liu
19 Jul 2021

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark W. Schmidt, Francis R. Bach
10 Dec 2012

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012