ResearchTrend.AI

arXiv: 2107.12058
Convergence in quadratic mean of averaged stochastic gradient algorithms without strong convexity nor bounded gradient

26 July 2021
Antoine Godichon-Baggioni

Papers citing "Convergence in quadratic mean of averaged stochastic gradient algorithms without strong convexity nor bounded gradient"

4 / 4 papers shown

Online estimation of the inverse of the Hessian for stochastic optimization with application to universal stochastic Newton algorithms
Antoine Godichon-Baggioni, Wei Lu, Bruno Portier
15 Jan 2024

Non asymptotic analysis of Adaptive stochastic gradient algorithms and applications
Antoine Godichon-Baggioni, Pierre Tarrago
01 Mar 2023

Non-Asymptotic Analysis of Stochastic Approximation Algorithms for Streaming Data
Antoine Godichon-Baggioni, Nicklas Werge, Olivier Wintenberger
15 Sep 2021

A Simple Convergence Proof of Adam and Adagrad
Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier
05 Mar 2020