
Revisiting Stochastic Approximation and Stochastic Gradient Descent

16 May 2025
Rajeeva Laxman Karandikar
Bhamidi Visweswara Rao
Mathukumalli Vidyasagar
Main: 27 pages
Bibliography: 3 pages
Abstract

In this paper, we take a fresh look at stochastic approximation (SA) and Stochastic Gradient Descent (SGD). We derive new sufficient conditions for the convergence of SA. In particular, the "noise" or measurement error need not have a finite second moment, and under suitable conditions, not even a finite mean. By adapting this method of proof, we also derive sufficient conditions for the convergence of zero-order SGD, wherein the stochastic gradient is computed using only two function evaluations, and no gradient computations. The sufficient conditions derived here are the weakest to date, thus leading to a considerable expansion of the applicability of SA and SGD theory.
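The paper itself contains no code, but the two-evaluation idea behind zero-order SGD can be sketched in a few lines. The objective f, the step-size schedule a_t, and the probing radius c_t below are illustrative assumptions rather than the authors' construction; the sketch only shows the general pattern of estimating a gradient from two function evaluations along a random direction and feeding it to SGD.

# Illustrative sketch only: generic two-point (zero-order) gradient
# estimation inside an SGD loop. The objective, step sizes, and probing
# radii are assumed for the demo, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def f(theta):
    # Hypothetical smooth objective (a simple quadratic), standing in
    # for the function whose minimizer is sought.
    return 0.5 * float(np.sum(theta ** 2))

def two_point_gradient(f, theta, c, rng):
    # Gradient estimate from only two function evaluations along a
    # random unit direction u; no gradient computations are used.
    u = rng.standard_normal(theta.shape)
    u /= np.linalg.norm(u)
    return theta.size * (f(theta + c * u) - f(theta - c * u)) / (2.0 * c) * u

theta = rng.standard_normal(5)
for t in range(1, 2001):
    a_t = 1.0 / t            # diminishing step size
    c_t = 1.0 / t ** 0.25    # slowly shrinking probing radius
    theta = theta - a_t * two_point_gradient(f, theta, c_t, rng)

print(theta)  # iterates drift toward the minimizer at the origin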

View on arXiv
@article{karandikar2025_2505.11343,
  title={Revisiting Stochastic Approximation and Stochastic Gradient Descent},
  author={Rajeeva Laxman Karandikar and Bhamidi Visweswara Rao and Mathukumalli Vidyasagar},
  journal={arXiv preprint arXiv:2505.11343},
  year={2025}
}