SGD with Clipping is Secretly Estimating the Median Gradient

20 February 2024
Fabian Schaipp
Guillaume Garrigos
Umut Simsekli
Robert M. Gower
ArXiv · PDF · HTML
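
The listing below gives papers citing this work. For context, the title's claim is that running SGD with gradient clipping amounts to iteratively estimating the median, rather than the mean, of the stochastic gradients; the median is a robust estimator, which suggests why clipping helps under heavy-tailed gradient noise. The following snippet is a minimal illustrative sketch, not the authors' algorithm: the toy quadratic objective, the heavy-tailed noise model, and the coordinate-wise median baseline are assumptions made only for this example.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_gradient(w):
        # Hypothetical stochastic gradient of a quadratic loss, corrupted by
        # heavy-tailed Student-t noise (df=1.5), so averaging gradients is unreliable.
        return w + rng.standard_t(df=1.5, size=w.shape)

    def clipped_sgd_step(w, lr=0.1, clip=1.0):
        # Standard clipped-SGD update: rescale the stochastic gradient
        # whenever its norm exceeds the clipping threshold.
        g = sample_gradient(w)
        norm = np.linalg.norm(g)
        if norm > clip:
            g *= clip / norm
        return w - lr * g

    def median_gradient_step(w, lr=0.1, batch=16):
        # Explicit comparison point: step along the coordinate-wise median
        # of a batch of stochastic gradients instead of their mean.
        grads = np.stack([sample_gradient(w) for _ in range(batch)])
        return w - lr * np.median(grads, axis=0)

    w_clip = np.ones(5)
    w_med = np.ones(5)
    for _ in range(500):
        w_clip = clipped_sgd_step(w_clip)
        w_med = median_gradient_step(w_med)
    print("clipped SGD iterate:    ", np.round(w_clip, 3))
    print("median-gradient iterate:", np.round(w_med, 3))

In this toy run both iterates stay close to the minimizer at the origin, whereas the same step size without clipping would occasionally jump far away whenever an outlier gradient is drawn.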

Papers citing "SGD with Clipping is Secretly Estimating the Median Gradient"

4 / 4 papers shown

1. Convergence Rates of Stochastic Gradient Descent under Infinite Noise Variance
   Hongjian Wang, Mert Gurbuzbalaban, Lingjiong Zhu, Umut Şimşekli, Murat A. Erdogdu
   20 Feb 2021

2. Learning from History for Byzantine Robust Optimization
   Sai Praneeth Karimireddy, Lie He, Martin Jaggi
   FedML, AAML
   18 Dec 2020

3. Uncertainty Principle for Communication Compression in Distributed and Federated Learning and the Search for an Optimal Compressor
   M. Safaryan, Egor Shulgin, Peter Richtárik
   20 Feb 2020

4. Mean estimation and regression under heavy-tailed distributions -- a survey
   Gábor Lugosi, S. Mendelson
   10 Jun 2019