
Improving the Sample and Communication Complexity for Decentralized Non-Convex Optimization: A Joint Gradient Estimation and Tracking Approach

13 October 2019
Haoran Sun, Songtao Lu, Mingyi Hong

Papers citing "Improving the Sample and Communication Complexity for Decentralized Non-Convex Optimization: A Joint Gradient Estimation and Tracking Approach" (4 papers shown)
Achieving Linear Speedup in Decentralized Stochastic Compositional Minimax Optimization
Hongchang Gao
25 Jul 2023

INTERACT: Achieving Low Sample and Communication Complexities in Decentralized Bilevel Learning over Networks
Zhuqing Liu, Xin Zhang, Prashant Khanduri, Songtao Lu, Jia Liu
27 Jul 2022

Variance-Reduced Stochastic Quasi-Newton Methods for Decentralized Learning: Part I
Jiaojiao Zhang, Huikang Liu, Anthony Man-Cho So, Qing Ling
19 Jan 2022

MARINA: Faster Non-Convex Distributed Learning with Compression
Eduard A. Gorbunov, Konstantin Burlachenko, Zhize Li, Peter Richtárik
15 Feb 2021