Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates
Kento Imaizumi, Hideaki Iiduka
23 February 2024 · arXiv:2402.15344
ArXiv · PDF · HTML
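For context on the paper's subject, below is a minimal sketch of the two step-size schedules named in the title: SGD run with a constant learning rate versus a decaying one. The toy quadratic objective, the noise model, and the specific schedule constants are illustrative assumptions, not the paper's analysis or code.

```python
import numpy as np

def sgd(grad, x0, n_steps, lr_schedule, rng):
    """Run SGD: x_{t+1} = x_t - eta_t * g_t, with eta_t = lr_schedule(t)."""
    x = np.asarray(x0, dtype=float).copy()
    for t in range(n_steps):
        g = grad(x, rng)                 # one stochastic first-order oracle call
        x = x - lr_schedule(t) * g
    return x

def noisy_grad(x, rng):
    """Unbiased noisy gradient of the toy objective f(x) = 0.5 * ||x||^2."""
    return x + 0.1 * rng.standard_normal(x.shape)

rng = np.random.default_rng(0)
x0 = np.ones(10)

# Constant schedule: eta_t = 0.1 for every step.
x_const = sgd(noisy_grad, x0, 1000, lambda t: 0.1, rng)

# Decaying schedule: eta_t = 0.5 / sqrt(t + 1), a common choice.
x_decay = sgd(noisy_grad, x0, 1000, lambda t: 0.5 / np.sqrt(t + 1), rng)

print(np.linalg.norm(x_const), np.linalg.norm(x_decay))
```

With a constant step size, SGD converges only to a neighborhood of the minimizer whose radius depends on the gradient noise, whereas a suitably decaying step size drives the iterates closer at the cost of slower progress per step; trading off these effects against iteration and oracle counts is the complexity question the paper's title refers to.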
Papers citing "Iteration and Stochastic First-order Oracle Complexities of Stochastic Gradient Descent using Constant and Decaying Learning Rates" (8 / 8 papers shown):

- Ahmed Khaled and Peter Richtárik. "Better Theory for SGD in the Nonconvex World." 09 Feb 2020.
- Sharan Vaswani, Aaron Mishkin, I. Laradji, Mark Schmidt, Gauthier Gidel, and Simon Lacoste-Julien. "Painless Stochastic Gradient: Interpolation, Line-Search, and Convergence Rates." 24 May 2019.
- Benjamin J. Fehrman, Benjamin Gess, and Arnulf Jentzen. "Convergence rates for the stochastic gradient descent method for non-convex objective functions." 02 Apr 2019.
- Christopher J. Shallue, Jaehoon Lee, J. Antognini, J. Mamou, J. Ketterling, and Yao Wang. "Measuring the Effects of Data Parallelism on Neural Network Training." 08 Nov 2018.
- Kevin Scaman and Aladin Virmaux. "Lipschitz regularity of deep neural networks: analysis and efficient estimation." 28 May 2018.
- Samuel L. Smith, Pieter-Jan Kindermans, Chris Ying, and Quoc V. Le. "Don't Decay the Learning Rate, Increase the Batch Size." 01 Nov 2017.
- Saeed Ghadimi and Guanghui Lan. "Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming." 22 Sep 2013.
- Andrew Cotter, Ohad Shamir, Nathan Srebro, and Karthik Sridharan. "Better Mini-Batch Algorithms via Accelerated Gradient Methods." 22 Jun 2011.