Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors
Atsushi Nitanda, Taiji Suzuki
arXiv:1806.05438, 14 June 2018
Papers citing "Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors" (4 of 4 papers shown):
1. Multiclass learning with margin: exponential rates with no bias-variance trade-off
   Stefano Vigogna, Giacomo Meanti, Ernesto De Vito, Lorenzo Rosasco
   03 Feb 2022

2. A Scaling Law for Synthetic-to-Real Transfer: How Much Is Your Pre-training Effective?
   Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, S. Maeda, Kohei Hayashi
   25 Aug 2021

3. A Theory of Universal Learning
   Olivier Bousquet, Steve Hanneke, Shay Moran, Ramon van Handel, Amir Yehudayoff
   09 Nov 2020

4. A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
   Simon Lacoste-Julien, Mark W. Schmidt, Francis R. Bach
   10 Dec 2012