arXiv:1904.01517

Convergence rates for the stochastic gradient descent method for non-convex objective functions

2 April 2019
Benjamin J. Fehrman
Benjamin Gess
Arnulf Jentzen
Abstract

We prove the local convergence to minima and estimates on the rate of convergence for the stochastic gradient descent method in the case of not necessarily globally convex nor contracting objective functions. In particular, the results are applicable to simple objective functions arising in machine learning.
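To make the recursion under discussion concrete, the sketch below runs plain stochastic gradient descent, θ_{n+1} = θ_n − γ_n g(θ_n, X_{n+1}), on a toy non-convex (double-well) objective with additive gradient noise. The objective, the noise model, and the Robbins–Monro step-size schedule are illustrative assumptions only, not the specific class of objectives or the convergence rates established in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_grad(theta, x, noise_scale=1.0):
    """Noisy gradient of the double-well objective f(theta) = (theta^2 - 1)^2.

    The exact gradient is 4 * theta * (theta^2 - 1); x is a standard normal
    sample acting as additive gradient noise.  Objective and noise model are
    illustrative only, not the setting treated in the paper.
    """
    return 4.0 * theta * (theta**2 - 1.0) + noise_scale * x

def sgd(theta0, n_steps=10_000, gamma0=0.05, exponent=0.6):
    """Plain SGD recursion theta_{n+1} = theta_n - gamma_n * g(theta_n, X_{n+1})
    with a decaying step-size schedule gamma_n = gamma0 / (n + 1)^exponent."""
    theta = theta0
    for n in range(n_steps):
        x = rng.standard_normal()               # fresh sample each iteration
        gamma = gamma0 / (n + 1) ** exponent    # decaying step size
        theta -= gamma * stochastic_grad(theta, x)
    return theta

# Started just off the local maximum at 0, the iterates settle close to one
# of the two local minima at theta = +/- 1.
print(sgd(theta0=0.1))
```

The step-size exponent 0.6 is an arbitrary choice satisfying the classical conditions (summed steps diverge, summed squared steps converge); it is not the schedule analysed by the authors.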
