Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors
Atsushi Nitanda, Taiji Suzuki
14 June 2018 · arXiv:1806.05438

Papers citing "Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors" (4 of 4 papers shown)

Multiclass learning with margin: exponential rates with no bias-variance trade-off
Stefano Vigogna, Giacomo Meanti, Ernesto De Vito, Lorenzo Rosasco
03 Feb 2022

A Scaling Law for Synthetic-to-Real Transfer: How Much Is Your Pre-training Effective?
Hiroaki Mikami, Kenji Fukumizu, Shogo Murai, Shuji Suzuki, Yuta Kikuchi, Taiji Suzuki, S. Maeda, Kohei Hayashi
25 Aug 2021

A Theory of Universal Learning
Olivier Bousquet, Steve Hanneke, Shay Moran, Ramon van Handel, Amir Yehudayoff
09 Nov 2020

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark W. Schmidt, Francis R. Bach
10 Dec 2012