Distributed Fixed Point Methods with Compressed Iterates

arXiv:1912.09925 · 20 December 2019

Sélim Chraibi, Ahmed Khaled, D. Kovalev, Peter Richtárik, Adil Salim, Martin Takáč

FedML

Papers citing "Distributed Fixed Point Methods with Compressed Iterates" (3 papers shown)

  • Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks
    Xin Zhang, Jia Liu, Zhengyuan Zhu, Elizabeth S. Bentley
    10 Dec 2018

  • Cooperative SGD: A unified Framework for the Design and Analysis of Communication-Efficient SGD Algorithms
    Jianyu Wang, Gauri Joshi
    22 Aug 2018

  • Federated Learning: Strategies for Improving Communication Efficiency
    Jakub Konečný, H. B. McMahan, Felix X. Yu, Peter Richtárik, A. Suresh, Dave Bacon
    FedML
    18 Oct 2016