Innovation Compression for Communication-efficient Distributed Optimization with Linear Convergence

14 May 2021
Jiaqi Zhang
Keyou You
Lihua Xie

Papers citing "Innovation Compression for Communication-efficient Distributed Optimization with Linear Convergence"

2 papers shown
CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence
Kun-Yen Huang
Shin-Yi Pu
14 Jan 2023
Scalable Average Consensus with Compressed Communications
Taha Toghani
César A. Uribe
14 Sep 2021