ResearchTrend.AI
arXiv:1908.02747
Distributed Gradient Descent: Nonconvergence to Saddle Points and the Stable-Manifold Theorem

7 August 2019
Brian Swenson
Ryan W. Murray
H. Vincent Poor
S. Kar

Papers citing "Distributed Gradient Descent: Nonconvergence to Saddle Points and the Stable-Manifold Theorem"

2 papers shown
Understanding A Class of Decentralized and Federated Optimization Algorithms: A Multi-Rate Feedback Control Perspective
Xinwei Zhang, Mingyi Hong, N. Elia
FedML · 27 Apr 2022
Distributed Gradient Flow: Nonsmoothness, Nonconvexity, and Saddle Point Evasion
Brian Swenson, Ryan W. Murray, H. Vincent Poor, S. Kar
12 Aug 2020