ResearchTrend.AI
Compressed Gradient Tracking for Decentralized Optimization Over General Directed Networks

14 June 2021
Zhuoqing Song, Lei Shi, Shi Pu, Ming Yan

Papers citing "Compressed Gradient Tracking for Decentralized Optimization Over General Directed Networks"

4 of 4 citing papers shown.

  • CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence (Kun-Yen Huang, Shin-Yi Pu; 14 Jan 2023)
  • Scalable Average Consensus with Compressed Communications (Taha Toghani, César A. Uribe; 14 Sep 2021)
  • Decentralized Composite Optimization with Compression (Yao Li, Xiaorui Liu, Jiliang Tang, Ming Yan, Kun Yuan; 10 Aug 2021)
  • Communication-efficient Distributed Cooperative Learning with Compressed Beliefs (Taha Toghani, César A. Uribe; 14 Feb 2021)