Bandwidth-based Step-Sizes for Non-Convex Stochastic Optimization
arXiv:2106.02888
5 June 2021
Xiaoyu Wang, M. Johansson

Papers citing "Bandwidth-based Step-Sizes for Non-Convex Stochastic Optimization"

6 papers shown:

  1. Statistical Adaptive Stochastic Gradient Methods
     Pengchuan Zhang, Hunter Lang, Qiang Liu, Lin Xiao (25 Feb 2020)
  2. Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization
     Vien V. Mai, M. Johansson (13 Feb 2020)
  3. Understanding the Role of Momentum in Stochastic Gradient Methods
     Igor Gitman, Hunter Lang, Pengchuan Zhang, Lin Xiao (30 Oct 2019)
  4. SGDR: Stochastic Gradient Descent with Warm Restarts
     I. Loshchilov, Frank Hutter (13 Aug 2016)
  5. Cyclical Learning Rates for Training Neural Networks
     L. Smith (03 Jun 2015)
  6. Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming
     Saeed Ghadimi, Guanghui Lan (22 Sep 2013)
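The papers listed above all concern step-size schedules for stochastic gradient methods (cyclical learning rates, warm restarts, and the bandwidth-based step sizes of the cited paper's title). As a rough illustration only, the following minimal Python sketch shows a band-constrained, cyclically varying step size: the schedule may oscillate, but it is kept inside a band whose upper and lower boundaries shrink over time. The boundary functions, parameter values, and cosine variation here are hypothetical choices for illustration and are not taken from any of the papers above.

import math

def banded_step_size(t, eta0=0.1, period=50, p=0.5):
    """Illustrative bandwidth-style step size (hypothetical parameters).

    The step size oscillates (here via a cosine cycle, in the spirit of
    SGDR/cyclical schedules) but is clipped so that it always lies inside
    a band [lower(t), upper(t)] that decays like 1/t**p.
    """
    upper = eta0 / (t ** p)      # upper boundary of the band
    lower = 0.1 * upper          # lower boundary of the band
    # Cyclical proposal inside the band, restarting every `period` iterations
    cycle_pos = ((t - 1) % period) / period
    proposal = lower + 0.5 * (upper - lower) * (1.0 + math.cos(math.pi * cycle_pos))
    # Clip to the band (redundant for this proposal, kept for clarity)
    return min(max(proposal, lower), upper)

# Example: print the schedule for the first few iterations
for t in range(1, 6):
    print(t, round(banded_step_size(t), 4))

In this sketch the cosine cycle starts each period at the upper boundary and decays to the lower boundary, so the resulting step sizes resemble warm restarts while remaining confined to the shrinking band.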