Adaptive proximal gradient methods are universal without approximation

9 February 2024
Konstantinos A. Oikonomidis, Emanuel Laude, P. Latafat, Andreas Themelis, Panagiotis Patrinos

Papers citing "Adaptive proximal gradient methods are universal without approximation"

  1. Adaptive Extrapolated Proximal Gradient Methods with Variance Reduction for Composite Nonconvex Finite-Sum Minimization
     Ganzhao Yuan (28 Feb 2025)
  2. The inexact power augmented Lagrangian method for constrained nonconvex optimization
     Alexander Bodard, Konstantinos A. Oikonomidis, Emanuel Laude, Panagiotis Patrinos (26 Oct 2024)
  3. Safeguarding adaptive methods: global convergence of Barzilai-Borwein and other stepsize choices
     Hongjia Ou, Andreas Themelis (15 Apr 2024)