Robust Distributed Accelerated Stochastic Gradient Methods for Multi-Agent Networks

19 October 2019
Alireza Fallah
Mert Gurbuzbalaban
Asuman Ozdaglar
Umut Simsekli
Lingjiong Zhu

Papers citing "Robust Distributed Accelerated Stochastic Gradient Methods for Multi-Agent Networks"

5 / 5 papers shown

A cutting-surface consensus approach for distributed robust optimization of multi-agent systems
Jun Fu
Xunhao Wu
07 Sep 2023

Uniform-in-Time Wasserstein Stability Bounds for (Noisy) Stochastic Gradient Descent
Lingjiong Zhu
Mert Gurbuzbalaban
Anant Raj
Umut Simsekli
20 May 2023

Heavy-Tail Phenomenon in Decentralized SGD
Mert Gurbuzbalaban
Yuanhan Hu
Umut Simsekli
Kun Yuan
Lingjiong Zhu
13 May 2022

Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices
Max Ryabinin
Eduard A. Gorbunov
Vsevolod Plokhotnyuk
Gennady Pekhimenko
04 Mar 2021

A Unified Theory of Decentralized SGD with Changing Topology and Local Updates
Anastasia Koloskova
Nicolas Loizou
Sadra Boreiri
Martin Jaggi
Sebastian U. Stich
FedML
23 Mar 2020