Exponential convergence of testing error for stochastic gradient methods
arXiv:1712.04755 · 13 December 2017
Loucas Pillaud-Vivien, Alessandro Rudi, Francis R. Bach

Papers citing "Exponential convergence of testing error for stochastic gradient methods"

3 / 3 papers shown
1. On the Benefits of Large Learning Rates for Kernel Methods
   Gaspard Beugnot, Julien Mairal, Alessandro Rudi
   28 Feb 2022

2. Multiclass learning with margin: exponential rates with no bias-variance trade-off
   S. Vigogna, Giacomo Meanti, E. De Vito, Lorenzo Rosasco
   03 Feb 2022

3. Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss (MLT)
   Lénaïc Chizat, Francis R. Bach
   11 Feb 2020