
Asynchrony and Acceleration in Gossip Algorithms
Mathieu Even, Hadrien Hendrikx, Laurent Massoulié
arXiv:2011.02379, 4 November 2020

Papers citing "Asynchrony and Acceleration in Gossip Algorithms"

12 papers shown.
Advances in Asynchronous Parallel and Distributed Optimization
Mahmoud Assran, Arda Aytekin, Hamid Reza Feyzmahdavian, M. Johansson, Michael G. Rabbat
24 Jun 2020

Provably Accelerated Randomized Gossip Algorithms
Nicolas Loizou, Michael G. Rabbat, Peter Richtárik
31 Oct 2018

Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives
Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié
05 Oct 2018

Accelerated Gossip via Stochastic Heavy Ball Method
Nicolas Loizou, Peter Richtárik
23 Sep 2018

On Markov Chain Gradient Descent
Tao Sun, Yuejiao Sun, W. Yin
12 Sep 2018

A Dual Approach for Optimal Algorithms in Distributed Optimization over Networks
César A. Uribe, Soomin Lee, Alexander Gasnikov, A. Nedić
03 Sep 2018

D$^2$: Decentralized Training over Decentralized Data
Hanlin Tang, Xiangru Lian, Ming Yan, Ce Zhang, Ji Liu
19 Mar 2018

Can Decentralized Algorithms Outperform Centralized Algorithms? A Case Study for Decentralized Parallel Stochastic Gradient Descent
Xiangru Lian, Ce Zhang, Huan Zhang, Cho-Jui Hsieh, Wei Zhang, Ji Liu
25 May 2017

Optimal algorithms for smooth and strongly convex distributed optimization in networks
Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Y. Lee, Laurent Massoulié
28 Feb 2017

Asynchronous Stochastic Gradient Descent with Delay Compensation
Shuxin Zheng, Qi Meng, Taifeng Wang, Wei Chen, Nenghai Yu, Zhiming Ma, Tie-Yan Liu
27 Sep 2016

Chebyshev Polynomials in Distributed Consensus Applications
Eduardo Montijano, J. I. Montijano, C. Sagüés
21 Nov 2011

HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent
Feng Niu, Benjamin Recht, Christopher Ré, Stephen J. Wright
28 Jun 2011