Amos: An Adam-style Optimizer with Adaptive Weight Decay towards Model-Oriented Scale

21 October 2022
Ran Tian, Ankur P. Parikh
ODL
ArXiv · PDF · HTML

Papers citing "Amos: An Adam-style Optimizer with Adaptive Weight Decay towards Model-Oriented Scale"

3 / 3 papers shown
Poisson Process for Bayesian Optimization
Xiaoxing Wang, Jiaxing Li, Chao Xue, Wei Liu, Weifeng Liu, Xiaokang Yang, Junchi Yan, Dacheng Tao
05 Feb 2024

A Theoretical and Empirical Study on the Convergence of Adam with an "Exact" Constant Step Size in Non-Convex Settings
Alokendu Mazumder, Rishabh Sabharwal, Manan Tayal, Bhartendu Kumar, Punit Rathore
15 Sep 2023

Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
23 Jan 2020