ResearchTrend.AI

Breaking the Convergence Barrier: Optimization via Fixed-Time Convergent Flows (arXiv:2112.01363)

2 December 2021
Param Budhraja, Mayank Baranwal, Kunal Garg, Ashish R. Hota
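
The flows named in the title are gradient dynamics whose convergence time admits a bound independent of the initialization. As a hedged sketch in the style of the fixed-time stability literature (the paper's exact scheme, exponents, and assumptions may differ), such a flow rescales the gradient with two exponents:

    \[
      \dot{x}(t) = -\,c_1 \frac{\nabla f(x)}{\lVert \nabla f(x) \rVert^{\frac{p_1-2}{p_1-1}}}
                   \;-\; c_2 \frac{\nabla f(x)}{\lVert \nabla f(x) \rVert^{\frac{p_2-2}{p_2-1}}},
      \qquad c_1, c_2 > 0, \; p_1 > 2, \; p_2 \in (1, 2).
    \]

Under a gradient-dominance (PL-type) condition, the two terms yield a Lyapunov inequality of the form \dot{V} \le -aV^{q} - bV^{p} with 0 < q < 1 < p, the standard certificate for fixed-time convergence: f(x(t)) reaches its minimum within a time bound that does not depend on x(0), whereas plain gradient flow only converges asymptotically.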

References cited in "Breaking the Convergence Barrier: Optimization via Fixed-Time Convergent Flows" (5 of 5 shown; brief formulation sketches follow the list)

  1. Accelerated Method for Stochastic Composition Optimization with Nonsmooth Regularization
     Zhouyuan Huo, Bin Gu, Ji Liu, Heng Huang
     10 Nov 2017 · 51 citations

  2. Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
     Hamed Karimi, Julie Nutini, Mark Schmidt
     16 Aug 2016 · 1,220 citations

  3. A Variational Perspective on Accelerated Methods in Optimization
     Andre Wibisono, Ashia Wilson, Michael I. Jordan
     14 Mar 2016 · 573 citations

  4. A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
     Weijie Su, Stephen P. Boyd, Emmanuel J. Candès
     04 Mar 2015 · 1,168 citations

  5. Adam: A Method for Stochastic Optimization
     Diederik P. Kingma, Jimmy Ba
     22 Dec 2014 · 150,115 citations
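
The notation in the sketches below is illustrative and may differ from the papers' own. Entry 1 treats stochastic composition optimization with a nonsmooth regularizer, typically posed as a nested finite-sum problem of roughly this shape:

    \[
      \min_{x \in \mathbb{R}^d} \; f\bigl(g(x)\bigr) + h(x),
      \qquad
      f(y) = \frac{1}{n} \sum_{i=1}^{n} f_i(y),
      \quad
      g(x) = \frac{1}{m} \sum_{j=1}^{m} g_j(x),
    \]

where the nonsmooth h (an \ell_1 penalty, say) is handled through its proximal operator and the nested sums are sampled stochastically.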
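
Entry 2 builds on the Polyak-Łojasiewicz inequality: an L-smooth f with minimum value f^* is \mu-PL if

    \[
      \tfrac{1}{2} \lVert \nabla f(x) \rVert^2 \;\ge\; \mu \bigl( f(x) - f^{*} \bigr)
      \quad \text{for all } x,
    \]

in which case gradient descent with step size 1/L converges linearly, f(x_k) - f^* \le (1 - \mu/L)^k \, (f(x_0) - f^*), without requiring convexity.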
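
Entry 3 generates accelerated dynamics from a variational principle; the central object in that line of work is a Bregman Lagrangian of roughly the form

    \[
      \mathcal{L}(X, V, t) = e^{\alpha_t + \gamma_t}
      \Bigl( D_h\bigl(X + e^{-\alpha_t} V, \, X\bigr) - e^{\beta_t} f(X) \Bigr),
    \]

where D_h is a Bregman divergence; under the ideal-scaling conditions \dot{\beta}_t \le e^{\alpha_t} and \dot{\gamma}_t = e^{\alpha_t}, the Euler-Lagrange flow satisfies f(X(t)) - f^* = O(e^{-\beta_t}).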
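
Entry 4 is the continuous-time model of Nesterov's accelerated gradient method:

    \[
      \ddot{X}(t) + \frac{3}{t} \, \dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
      \qquad X(0) = x_0, \; \dot{X}(0) = 0,
    \]

whose solution satisfies f(X(t)) - f^* = O(1/t^2), mirroring the discrete method's accelerated rate and motivating the ODE view of optimization on which fixed-time convergent flows build.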
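
Entry 5 is the Adam optimizer. A minimal runnable sketch of its update rule follows; hyperparameter names and defaults are taken from the paper, while the quadratic objective in the usage loop is an illustrative stand-in for an arbitrary stochastic gradient:

    # Adam update: exponential moving averages of the gradient and its
    # square, bias-corrected, then a coordinate-wise scaled step.
    import numpy as np

    def adam_step(theta, grad, m, v, t, alpha=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8):
        m = beta1 * m + (1 - beta1) * grad        # first moment (mean)
        v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
        m_hat = m / (1 - beta1 ** t)              # bias correction for zero init
        v_hat = v / (1 - beta2 ** t)
        theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

    # Usage: minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
    x = np.ones(3)
    m, v = np.zeros_like(x), np.zeros_like(x)
    for t in range(1, 2001):
        x, m, v = adam_step(x, x, m, v, t)
    print(x)  # close to the minimizer at the origin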