ResearchTrend.AI
Accelerated Gradient Methods for Networked Optimization

arXiv:1211.2132 · 9 November 2012
E. Ghadimi, Iman Shames, M. Johansson

Papers citing "Accelerated Gradient Methods for Networked Optimization"

4 / 4 papers shown
Distributed and time-varying primal-dual dynamics via contraction analysis
Pedro Cisneros-Velarde, Saber Jafarpour, Francesco Bullo
27 Mar 2020
Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives
Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié
05 Oct 2018
Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods
Nicolas Loizou, Peter Richtárik
27 Dec 2017
An Online Convex Optimization Approach to Dynamic Network Resource Allocation
Tianyi Chen, Qing Ling, G. Giannakis
14 Jan 2017