ResearchTrend.AI
arXiv:2011.11985
Adam$^+$: A Stochastic Method with Adaptive Variance Reduction

24 November 2020
Mingrui Liu
Wei Zhang
Francesco Orabona
Tianbao Yang

Papers citing "Adam$^+$: A Stochastic Method with Adaptive Variance Reduction"

5 / 5 papers shown

  1. Gradient-Free Method for Heavily Constrained Nonconvex Optimization
     Wanli Shi, Hongchang Gao, Bin Gu
     31 Aug 2024
  2. PanBench: Towards High-Resolution and High-Performance Pansharpening
     Shiying Wang, Xuechao Zou, Kai Li, Junliang Xing, Pin Tao
     20 Nov 2023
  3. A Novel Convergence Analysis for Algorithms of the Adam Family
     Zhishuai Guo, Yi Tian Xu, W. Yin, Rong Jin, Tianbao Yang
     07 Dec 2021
  4. SVRG Meets AdaGrad: Painless Variance Reduction
     Benjamin Dubois-Taine, Sharan Vaswani, Reza Babanezhad, Mark Schmidt, Simon Lacoste-Julien
     18 Feb 2021
  5. Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers
     Robin M. Schmidt, Frank Schneider, Philipp Hennig
     03 Jul 2020