ResearchTrend.AI

arXiv:2110.03354
$\bar{G}_{mst}$: An Unbiased Stratified Statistic and a Fast Gradient Optimization Algorithm Based on It

7 October 2021
Aixiang Chen
Abstract

Current mainstream gradient optimization algorithms neglect or conflate the fluctuation in gradient expectation and variance caused by parameter updates between consecutive iterations. This paper remedies the issue by introducing a novel unbiased stratified statistic, $\bar{G}_{mst}$, and establishing a sufficient condition for fast convergence based on it. A new algorithm named MSSG, designed around $\bar{G}_{mst}$, outperforms other SGD-like algorithms. Theoretical conclusions and experimental evidence strongly suggest employing MSSG when training deep models.
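The abstract does not spell out how $\bar{G}_{mst}$ is computed, but the general idea behind an unbiased stratified statistic can be illustrated with classical stratified sampling: partition the population into strata, sample within each stratum, and weight each stratum's sample mean by the stratum's share of the population. The sketch below is a minimal illustration of that principle, not the paper's algorithm; the function `stratified_mean` and its parameters are hypothetical names chosen for this example.

```python
import random

def stratified_mean(strata, n_per_stratum, seed=0):
    """Unbiased stratified estimate of the population mean.

    Each stratum contributes its sample mean weighted by the stratum's
    fraction of the population, so the estimator stays unbiased even
    when strata have very different means or variances.
    """
    rng = random.Random(seed)
    total = sum(len(s) for s in strata)
    estimate = 0.0
    for s in strata:
        k = min(n_per_stratum, len(s))
        sample = rng.sample(s, k)          # sample without replacement
        weight = len(s) / total            # stratum's population share
        estimate += weight * (sum(sample) / k)
    return estimate

# Example: two strata with different means; sampling every element
# recovers the exact population mean.
strata = [[1.0, 2.0], [9.0, 10.0]]
print(stratified_mean(strata, n_per_stratum=2))  # 5.5
```

In a gradient-descent setting, the "population" would be the per-example gradients in a minibatch, and stratification reduces the variance of the gradient estimate without introducing bias.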
