ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

arXiv:1709.01427
Stochastic Gradient Descent: Going As Fast As Possible But Not Faster

5 September 2017
Alice Schoenauer Sebag
Marc Schoenauer
Michèle Sebag

Papers citing "Stochastic Gradient Descent: Going As Fast As Possible But Not Faster"

1 paper shown:

Flexible numerical optimization with ensmallen
Ryan R. Curtin, Marcus Edel, Rahul Prabhu, S. Basak, Zhihao Lou, Conrad Sanderson
09 Mar 2020