Modified Step Size for Enhanced Stochastic Gradient Descent: Convergence and Experiments

3 September 2023
M. S. Shamaee
S. F. Hafshejani
arXiv:2309.01248 (abs / PDF / HTML) · GitHub
Abstract

This paper introduces a novel approach to enhance the performance of the stochastic gradient descent (SGD) algorithm by incorporating a modified decay step size based on $\frac{1}{\sqrt{t}}$. The proposed step size integrates a logarithmic term, leading to the selection of smaller values in the final iterations. Our analysis establishes a convergence rate of $O\!\left(\frac{\ln T}{\sqrt{T}}\right)$ for smooth non-convex functions without the Polyak-Łojasiewicz condition. To evaluate the effectiveness of our approach, we conducted numerical experiments on image classification tasks using the FashionMNIST and CIFAR10 datasets; the results demonstrate significant improvements in accuracy, with gains of $0.5\%$ and $1.4\%$, respectively, compared to the traditional $\frac{1}{\sqrt{t}}$ step size. The source code can be found at https://github.com/Shamaeem/LNSQRTStepSize.
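As a concrete illustration, the following sketch plugs a log-damped $\frac{1}{\sqrt{t}}$ schedule into a plain SGD loop. The abstract does not give the exact functional form of the step size, so the schedule used here, $\eta_t = \eta_0 / (\sqrt{t}\,(1 + \alpha \ln t))$ with hypothetical hyperparameters eta0 and alpha, is only an assumed shape: it decays like the $\frac{1}{\sqrt{t}}$ baseline while the logarithmic term makes the final iterations take smaller steps, matching the behavior described above. The authors' actual schedule is in the linked repository.

import numpy as np

def log_sqrt_step(t, eta0=0.1, alpha=1.0):
    # Assumed schedule: 1/sqrt(t) decay damped by a logarithmic term so that
    # later iterations take smaller steps than the plain 1/sqrt(t) baseline.
    # t is the 1-indexed iteration counter.
    return eta0 / (np.sqrt(t) * (1.0 + alpha * np.log(t)))

def sgd(grad_fn, w0, samples, epochs=5, eta0=0.1, alpha=1.0):
    # Plain SGD loop; only the step-size rule differs from vanilla SGD.
    w = np.asarray(w0, dtype=float)
    t = 1
    for _ in range(epochs):
        for x, y in samples:
            w -= log_sqrt_step(t, eta0, alpha) * grad_fn(w, x, y)
            t += 1
    return w

# Usage: least-squares regression on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=200)
sq_grad = lambda w, x, yi: 2.0 * (x @ w - yi) * x  # gradient of (x.w - y)^2
w_hat = sgd(sq_grad, np.zeros(3), list(zip(X, y)))
print(w_hat)  # approaches w_true

Note that at $t = 1$ the logarithm vanishes, so this assumed schedule starts at the same value as the plain $\frac{1}{\sqrt{t}}$ baseline and only falls below it as training proceeds.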
