ResearchTrend.AI
arXiv:2202.00145
Step-size Adaptation Using Exponentiated Gradient Updates
31 January 2022
Ehsan Amid, Rohan Anil, Christopher Fifty, Manfred K. Warmuth
Papers citing "Step-size Adaptation Using Exponentiated Gradient Updates" (6 papers)
MetaOptimize: A Framework for Optimizing Step Sizes and Other Meta-parameters
Arsalan Sharifnassab, Saber Salehkaleybar, Richard Sutton
04 Feb 2024

An Automatic Learning Rate Schedule Algorithm for Achieving Faster Convergence and Steeper Descent
Zhao Song, Chiwun Yang
17 Oct 2023

Searching for Optimal Per-Coordinate Step-sizes with Multidimensional Backtracking
Frederik Kunstner, V. S. Portella, Mark Schmidt, Nick Harvey
05 Jun 2023

Federated Hypergradient Descent
A. K. Kan
03 Nov 2022

Learning to Optimize Quasi-Newton Methods
Isaac Liao, Rumen Dangovski, Jakob N. Foerster, Marin Soljacic
11 Oct 2022

Large-Scale Differentially Private BERT
Rohan Anil, Badih Ghazi, Vineet Gupta, Ravi Kumar, Pasin Manurangsi
03 Aug 2021