ResearchTrend.AI

Depth Without the Magic: Inductive Bias of Natural Gradient Descent
arXiv:2111.11542 — 22 November 2021
A. Kerekes
Anna Mészáros
Ferenc Huszár
Community: ODL

Papers citing "Depth Without the Magic: Inductive Bias of Natural Gradient Descent"

4 papers shown:

  1. Regularized Gauss-Newton for Optimizing Overparameterized Neural Networks — Adeyemi Damilare Adeoye, Philipp Christian Petersen, Alberto Bemporad (23 Apr 2024)
  2. Noise misleads rotation invariant algorithms on sparse targets — Manfred K. Warmuth, Wojciech Kotlowski, Matt Jones, Ehsan Amid (05 Mar 2024)
  3. Rethinking Gauss-Newton for learning over-parameterized models — Michael Arbel, Romain Menegaux, Pierre Wolinski — AI4CE (06 Feb 2023)
  4. Stochastic Training is Not Necessary for Generalization — Jonas Geiping, Micah Goldblum, Phillip E. Pope, Michael Moeller, Tom Goldstein (29 Sep 2021)