Private Stochastic Convex Optimization with Heavy Tails: Near-Optimality from Simple Reductions

arXiv:2406.02789 · 4 June 2024
Hilal Asi, Daogao Liu, Kevin Tian

Papers citing "Private Stochastic Convex Optimization with Heavy Tails: Near-Optimality from Simple Reductions"

3 / 3 papers shown
Heavy-Tailed Privacy: The Symmetric alpha-Stable Privacy Mechanism
Christopher Zawacki, Eyad H. Abed
25 Apr 2025

Private Stochastic Optimization With Large Worst-Case Lipschitz Parameter: Optimal Rates for (Non-Smooth) Convex Losses and Extension to Non-Convex Losses
Andrew Lowy, Meisam Razaviyayn
15 Sep 2022

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark W. Schmidt, Francis R. Bach
10 Dec 2012