ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Compressed Gradient Tracking Methods for Decentralized Optimization with Linear Convergence

25 March 2021
Yiwei Liao, Zhuoru Li, Kun-Yen Huang, Shi Pu

Papers citing "Compressed Gradient Tracking Methods for Decentralized Optimization with Linear Convergence"

3 / 3 papers shown
CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence
Kun-Yen Huang, Shin-Yi Pu
14 Jan 2023
BEER: Fast $O(1/T)$ Rate for Decentralized Nonconvex Optimization with Communication Compression
Haoyu Zhao, Boyue Li, Zhize Li, Peter Richtárik, Yuejie Chi
31 Jan 2022
Linearly Converging Error Compensated SGD
Eduard A. Gorbunov, D. Kovalev, Dmitry Makarenko, Peter Richtárik
23 Oct 2020