ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Distributed SAGA: Maintaining linear convergence rate with limited communication

29 May 2017
Clément Calauzènes, Nicolas Le Roux
arXiv:1705.10405

Papers citing "Distributed SAGA: Maintaining linear convergence rate with limited communication"

4 papers:
DSAG: A mixed synchronous-asynchronous iterative method for straggler-resilient learning
A. Severinson, E. Rosnes, S. E. Rouayheb, Alexandre Graell i Amat
27 Nov 2021
Federated Variance-Reduced Stochastic Gradient Descent with Robustness to Byzantine Attacks
Zhaoxian Wu, Qing Ling, Tianyi Chen, G. Giannakis
29 Dec 2019
Distributed Learning with Sparse Communications by Identification
Dmitry Grishchenko, F. Iutzeler, J. Malick, Massih-Reza Amini
10 Dec 2018
A Distributed Flexible Delay-tolerant Proximal Gradient Algorithm
Konstantin Mishchenko, F. Iutzeler, J. Malick
25 Jun 2018