Adam-family Methods for Nonsmooth Optimization with Convergence Guarantees

6 May 2023
Nachuan Xiao, Xiaoyin Hu, Xin Liu, Kim-Chuan Toh
arXiv: 2305.03938 (PDF, HTML)

Papers citing "Adam-family Methods for Nonsmooth Optimization with Convergence Guarantees"

3 / 3 papers shown
Developing Lagrangian-based Methods for Nonsmooth Nonconvex Optimization
Nachuan Xiao, Kuang-Yu Ding, Xiaoyin Hu, Kim-Chuan Toh
15 Apr 2024

Convergence of Decentralized Stochastic Subgradient-based Methods for Nonsmooth Nonconvex Functions
Siyuan Zhang, Nachuan Xiao, Xin Liu
18 Mar 2024

Towards Practical Adam: Non-Convexity, Convergence Theory, and Mini-Batch Acceleration
Congliang Chen, Li Shen, Fangyu Zou, Wei Liu
14 Jan 2021