Adam-like Algorithm with Smooth Clipping Attains Global Minima: Analysis Based on Ergodicity of Functional SDEs

29 November 2023
Keisuke Suzuki
Abstract

In this paper, we prove that an Adam-type algorithm with smooth clipping approaches the global minimizer of the regularized non-convex loss function. By adding smooth clipping and taking the state space to be the set of all trajectories, we can apply the ergodic theory of Markov semigroups to this algorithm and investigate its asymptotic behavior. The ergodic theory we establish in this paper reduces the problem of evaluating the convergence, generalization error, and discretization error of this algorithm to the problem of evaluating the difference between two functional stochastic differential equations (SDEs) with different drift coefficients. As a result of our analysis, we show that this algorithm minimizes the regularized non-convex loss function with errors of the form $n^{-1/2}$, $\eta^{1/4}$, $\beta^{-1} \log(\beta + 1)$, and $e^{-ct}$. Here, $c$ is a constant and $n$, $\eta$, $\beta$, and $t$ denote the size of the training dataset, the learning rate, the inverse temperature, and time, respectively.
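To make the setting concrete, the sketch below shows one plausible "Adam-like step with smooth clipping" in the spirit of the abstract: a preconditioned, clipped gradient step on the regularized loss plus Gibbs-type noise scaled by the inverse temperature. The abstract does not specify the exact clipping function, hyperparameters, or noise scaling, so every concrete choice here (the rational clipping form, the `lam` regularization weight, the `sqrt(2*eta/beta)` noise factor) is an illustrative assumption, not the paper's algorithm.

```python
import numpy as np

def smooth_clip(g, radius=1.0):
    # Illustrative smooth clipping: rescale g so its norm stays below `radius`
    # while depending smoothly on g. The paper's exact clipping map is not
    # given in the abstract; this particular form is an assumption.
    norm = np.linalg.norm(g)
    return g * (radius / (radius + norm))

def adam_like_step(theta, grad, m, v, eta=1e-3, beta1=0.9, beta2=0.999,
                   beta=1e4, lam=1e-4, eps=1e-8, rng=None):
    """One Adam-like update on the L2-regularized loss with smooth clipping
    and Langevin-type noise at inverse temperature `beta`.
    All hyperparameter values and the noise scaling are assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    g = smooth_clip(grad + lam * theta)        # clipped gradient of regularized loss
    m = beta1 * m + (1 - beta1) * g            # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * g ** 2       # second-moment estimate
    direction = m / (np.sqrt(v) + eps)         # Adam-style preconditioned direction
    noise = np.sqrt(2 * eta / beta) * rng.standard_normal(theta.shape)
    theta = theta - eta * direction + noise    # discretized SDE-like update
    return theta, m, v
```

Under this reading, the error terms in the abstract correspond to the sample size $n$, the step size `eta` of this discretization, the inverse temperature `beta` of the injected noise, and the running time $t$ of the underlying functional SDE.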
