On the Effect of Initialization: The Scaling Path of 2-Layer Neural Networks

31 March 2023
Sebastian Neumayer, Lénaïc Chizat, M. Unser
Papers citing "On the Effect of Initialization: The Scaling Path of 2-Layer Neural Networks"

2 papers shown
Large Learning Rate Tames Homogeneity: Convergence and Balancing Effect
Yuqing Wang, Minshuo Chen, T. Zhao, Molei Tao
AI4CE · 07 Oct 2021
A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
Weijie Su, Stephen P. Boyd, Emmanuel J. Candes
04 Mar 2015