Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks (arXiv:2205.09653)
Blake Bordelon, Cengiz Pehlevan
19 May 2022

Papers citing "Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks"

Showing 11 of 61 citing papers:
The Onset of Variance-Limited Behavior for Networks in the Lazy and Rich Regimes
Alexander B. Atanasov, Blake Bordelon, Sabarish Sainathan, Cengiz Pehlevan
23 Dec 2022

Infinite-width limit of deep linear neural networks
Lénaïc Chizat, Maria Colombo, Xavier Fernández-Real, Alessio Figalli
29 Nov 2022

Meta-Principled Family of Hyperparameter Scaling Strategies
Sho Yaida
10 Oct 2022

Second-order regression models exhibit progressive sharpening to the edge of stability
Atish Agarwala, Fabian Pedregosa, Jeffrey Pennington
10 Oct 2022

The Influence of Learning Rule on Representation Dynamics in Wide Neural Networks
Blake Bordelon, Cengiz Pehlevan
05 Oct 2022

Decomposing neural networks as mappings of correlation functions
Kirsten Fischer, Alexandre René, Christian Keup, Moritz Layer, David Dahmen, M. Helias
10 Feb 2022

The Eigenlearning Framework: A Conservation Law Perspective on Kernel Regression and Wide Neural Networks
James B. Simon, Madeline Dickens, Dhruva Karkada, M. DeWeese
08 Oct 2021

The large learning rate phase of deep learning: the catapult mechanism
Aitor Lewkowycz, Yasaman Bahri, Ethan Dyer, Jascha Narain Sohl-Dickstein, Guy Gur-Ari
04 Mar 2020

Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks
Blake Bordelon, Abdulkadir Canatar, Cengiz Pehlevan
07 Feb 2020

Why bigger is not always better: on finite and infinite neural networks
Laurence Aitchison
17 Oct 2019

Trainability and Accuracy of Neural Networks: An Interacting Particle System Approach
Grant M. Rotskoff, Eric Vanden-Eijnden
02 May 2018