Predicting the outputs of finite deep neural networks trained with noisy gradients
Gadi Naveh, Oded Ben-David, H. Sompolinsky, Z. Ringel
arXiv:2004.01190, 2 April 2020
Papers citing "Predicting the outputs of finite deep neural networks trained with noisy gradients" (11 of 11 papers shown):
1. Grokking as a First Order Phase Transition in Two Layer Networks. Noa Rubin, Inbar Seroussi, Z. Ringel. 05 Oct 2023.
2. Dynamics of Finite Width Kernel and Prediction Fluctuations in Mean Field Neural Networks. Blake Bordelon, Cengiz Pehlevan. 06 Apr 2023.
3. Online Learning for the Random Feature Model in the Student-Teacher Framework. Roman Worschech, B. Rosenow. 24 Mar 2023.
4. Globally Gated Deep Linear Networks. Qianyi Li, H. Sompolinsky. 31 Oct 2022.
5. Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks. Blake Bordelon, Cengiz Pehlevan. 19 May 2022.
6. Contrasting random and learned features in deep Bayesian linear regression. Jacob A. Zavatone-Veth, William L. Tong, Cengiz Pehlevan. 01 Mar 2022.
7. Separation of Scales and a Thermodynamic Description of Feature Learning in Some CNNs. Inbar Seroussi, Gadi Naveh, Z. Ringel. 31 Dec 2021.
8. Unified field theoretical approach to deep and recurrent neuronal networks. Kai Segadlo, Bastian Epping, Alexander van Meegen, David Dahmen, Michael Krämer, M. Helias. 10 Dec 2021.
9. The large learning rate phase of deep learning: the catapult mechanism. Aitor Lewkowycz, Yasaman Bahri, Ethan Dyer, Jascha Narain Sohl-Dickstein, Guy Gur-Ari. 04 Mar 2020.
10. Double Trouble in Double Descent: Bias and Variance(s) in the Lazy Regime. Stéphane d'Ascoli, Maria Refinetti, Giulio Biroli, Florent Krzakala. 02 Mar 2020.
11. MCMC using Hamiltonian dynamics. Radford M. Neal. 09 Jun 2012.