Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity
12 December 2017
Benjamin Grimmer
arXiv:1712.04104

Papers citing "Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity" (6 papers shown)

| Title | Authors | Citations | Date |
|---|---|---|---|
| Understanding Gradient Orthogonalization for Deep Learning via Non-Euclidean Trust-Region Optimization | Dmitry Kovalev | 4 | 16 Mar 2025 |
| Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization | Benjamin Grimmer, Danlin Li | 6 | 31 Dec 2024 |
| Directional Smoothness and Gradient Methods: Convergence and Adaptivity | Aaron Mishkin, Ahmed Khaled, Yuanhao Wang, Aaron Defazio, Robert Mansel Gower | 9 | 06 Mar 2024 |
| Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems | Damek Davis, Benjamin Grimmer | 113 | 12 Jul 2017 |
| A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method | Simon Lacoste-Julien, Mark Schmidt, Francis R. Bach | 260 | 10 Dec 2012 |
| Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization | Alexander Rakhlin, Ohad Shamir, Karthik Sridharan | 768 | 26 Sep 2011 |