
Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization

13 February 2020
Vien V. Mai, M. Johansson

Papers citing "Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization"

5 / 5 papers shown
HOME-3: High-Order Momentum Estimator with Third-Power Gradient for Convex and Smooth Nonconvex Optimization
Wei Zhang, Arif Hassan Zidan, Afrar Jahin, Wei Zhang, Tianming Liu
16 May 2025
On the Performance Analysis of Momentum Method: A Frequency Domain Perspective
Xianliang Li, Jun Luo, Zhiwei Zheng, Hanxiao Wang, Li Luo, Lingkun Wen, Linlong Wu, Sheng Xu
29 Nov 2024
Almost sure convergence rates of stochastic gradient methods under gradient domination
Simon Weissmann, Sara Klein, Waïss Azizian, Leif Döring
22 May 2024
Understanding the Role of Momentum in Stochastic Gradient Methods
Igor Gitman, Hunter Lang, Pengchuan Zhang, Lin Xiao
30 Oct 2019
Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming
Saeed Ghadimi, Guanghui Lan
22 Sep 2013