A mean-field limit for certain deep neural networks
Dyego Araújo, R. Oliveira, Daniel Yukimura
arXiv:1906.00193, 1 June 2019 [AI4CE]
Papers citing "A mean-field limit for certain deep neural networks" (50 / 56 papers shown)
- Saddle-To-Saddle Dynamics in Deep ReLU Networks: Low-Rank Bias in the First Saddle Escape. Ioannis Bantzis, James B. Simon, Arthur Jacot (27 May 2025) [ODL]
- Mean-Field Analysis for Learning Subspace-Sparse Polynomials with Gaussian Input. Ziang Chen, Rong Ge (10 Jan 2025) [MLT]
- A Mean Field Ansatz for Zero-Shot Weight Transfer. Xingyuan Chen, Wenwei Kuang, Lei Deng, Wei Han, Bo Bai, Goncalo dos Reis (16 Aug 2024)
- Generalization of Scaled Deep ResNets in the Mean-Field Regime. Yihang Chen, Fanghui Liu, Yiping Lu, Grigorios G. Chrysos, Volkan Cevher (14 Mar 2024)
- Commutative Width and Depth Scaling in Deep Neural Networks. Soufiane Hayou (02 Oct 2023)
- Fundamental limits of overparametrized shallow neural networks for supervised learning. Francesco Camilli, D. Tieplova, Jean Barbier (11 Jul 2023)
- Neural Hilbert Ladders: Multi-Layer Neural Networks in Function Space. Zhengdao Chen (03 Jul 2023)
- Feature-Learning Networks Are Consistent Across Widths At Realistic Scales. Nikhil Vyas, Alexander B. Atanasov, Blake Bordelon, Depen Morwani, Sabarish Sainathan, Cengiz Pehlevan (28 May 2023)
- Dynamics of Finite Width Kernel and Prediction Fluctuations in Mean Field Neural Networks. Blake Bordelon, Cengiz Pehlevan (06 Apr 2023) [MLT]
- Depth Separation with Multilayer Mean-Field Networks. Y. Ren, Mo Zhou, Rong Ge (03 Apr 2023) [OOD]
- M22: A Communication-Efficient Algorithm for Federated Learning Inspired by Rate-Distortion. Yangyi Liu, Stefano Rini, Sadaf Salehkalaibar, Jun Chen (23 Jan 2023) [FedML]
- Two-Scale Gradient Descent Ascent Dynamics Finds Mixed Nash Equilibria of Continuous Games: A Mean-Field Perspective. Yulong Lu (17 Dec 2022) [MLT, AI4CE]
- Nonlinear controllability and function representation by neural stochastic differential equations. Tanya Veeravalli, Maxim Raginsky (01 Dec 2022) [DiffM]
- A Functional-Space Mean-Field Theory of Partially-Trained Three-Layer Neural Networks. Zhengdao Chen, Eric Vanden-Eijnden, Joan Bruna (28 Oct 2022) [MLT]
- Proximal Mean Field Learning in Shallow Neural Networks. Alexis M. H. Teter, Iman Nodozi, A. Halder (25 Oct 2022) [FedML]
- Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence. Diyuan Wu, Vyacheslav Kungurtsev, Marco Mondelli (13 Oct 2022)
- Limitations of the NTK for Understanding Generalization in Deep Learning. Nikhil Vyas, Yamini Bansal, Preetum Nakkiran (20 Jun 2022)
- High-dimensional limit theorems for SGD: Effective dynamics and critical scaling. Gerard Ben Arous, Reza Gheissari, Aukosh Jagannath (08 Jun 2022)
- Self-Consistent Dynamical Field Theory of Kernel Evolution in Wide Neural Networks. Blake Bordelon, Cengiz Pehlevan (19 May 2022) [MLT]
- On Feature Learning in Neural Networks with Global Convergence Guarantees. Zhengdao Chen, Eric Vanden-Eijnden, Joan Bruna (22 Apr 2022) [MLT]
- How to Attain Communication-Efficient DNN Training? Convert, Compress, Correct. Zhongzhu Chen, Eduin E. Hernandez, Yu-Chih Huang, Stefano Rini (18 Apr 2022) [MQ]
- Gradient flows on graphons: existence, convergence, continuity equations. Sewoong Oh, Soumik Pal, Raghav Somani, Raghavendra Tripathi (18 Nov 2021)
- Mean-field Analysis of Piecewise Linear Solutions for Wide ReLU Networks. Aleksandr Shevchenko, Vyacheslav Kungurtsev, Marco Mondelli (03 Nov 2021) [MLT]
- A Riemannian Mean Field Formulation for Two-layer Neural Networks with Batch Normalization. Chao Ma, Lexing Ying (17 Oct 2021) [MLT]
- Gradient Descent on Infinitely Wide Neural Networks: Global Convergence and Generalization. Francis R. Bach, Lénaïc Chizat (15 Oct 2021) [MLT]
- Learning Mean-Field Equations from Particle Data Using WSINDy. Daniel Messenger, David M. Bortz (14 Oct 2021)
- On the Global Convergence of Gradient Descent for multi-layer ResNets in the mean-field regime. Zhiyan Ding, Shi Chen, Qin Li, S. Wright (06 Oct 2021) [MLT, AI4CE]
- Overparameterization of deep ResNet: zero loss and mean-field analysis. Zhiyan Ding, Shi Chen, Qin Li, S. Wright (30 May 2021) [ODL]
- Global Convergence of Three-layer Neural Networks in the Mean Field Regime. H. Pham, Phan-Minh Nguyen (11 May 2021) [MLT, AI4CE]
- A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Network. Mo Zhou, Rong Ge, Chi Jin (04 Feb 2021)
- Particle Dual Averaging: Optimization of Mean Field Neural Networks with Global Convergence Rate Analysis. Atsushi Nitanda, Denny Wu, Taiji Suzuki (31 Dec 2020)
- Mathematical Models of Overparameterized Neural Networks. Cong Fang, Hanze Dong, Tong Zhang (27 Dec 2020)
- Feature Learning in Infinite-Width Neural Networks. Greg Yang, J. E. Hu (30 Nov 2020) [MLT]
- Greedy Optimization Provably Wins the Lottery: Logarithmic Number of Winning Tickets is Enough. Mao Ye, Lemeng Wu, Qiang Liu (29 Oct 2020)
- A Dynamical Central Limit Theorem for Shallow Neural Networks. Zhengdao Chen, Grant M. Rotskoff, Joan Bruna, Eric Vanden-Eijnden (21 Aug 2020)
- On the Banach spaces associated with multi-layer ReLU networks: Function representation, approximation theory and gradient descent dynamics. E. Weinan, Stephan Wojtowytsch (30 Jul 2020) [MLT]
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks. Cong Fang, Jason D. Lee, Pengkun Yang, Tong Zhang (03 Jul 2020) [OOD, FedML]
- Go Wide, Then Narrow: Efficient Training of Deep Thin Networks. Denny Zhou, Mao Ye, Chen Chen, Tianjian Meng, Mingxing Tan, Xiaodan Song, Quoc V. Le, Qiang Liu, Dale Schuurmans (01 Jul 2020)
- An analytic theory of shallow networks dynamics for hinge loss classification. Franco Pellegrini, Giulio Biroli (19 Jun 2020)
- A Note on the Global Convergence of Multilayer Neural Networks in the Mean Field Regime. H. Pham, Phan-Minh Nguyen (16 Jun 2020) [MLT, AI4CE]
- Can Temporal-Difference and Q-Learning Learn Representation? A Mean-Field Theory. Yufeng Zhang, Qi Cai, Zhuoran Yang, Yongxin Chen, Zhaoran Wang (08 Jun 2020) [OOD, MLT]
- On the Convergence of Gradient Descent Training for Two-layer ReLU-networks in the Mean Field Regime. Stephan Wojtowytsch (27 May 2020) [MLT]
- Can Shallow Neural Networks Beat the Curse of Dimensionality? A mean field training perspective. Stephan Wojtowytsch, E. Weinan (21 May 2020) [MLT]
- Predicting the outputs of finite deep neural networks trained with noisy gradients. Gadi Naveh, Oded Ben-David, H. Sompolinsky, Zohar Ringel (02 Apr 2020)
- A Mean-field Analysis of Deep ResNet and Beyond: Towards Provable Optimization Via Overparameterization From Depth. Yiping Lu, Chao Ma, Yulong Lu, Jianfeng Lu, Lexing Ying (11 Mar 2020) [MLT]
- Good Subnetworks Provably Exist: Pruning via Greedy Forward Selection. Mao Ye, Chengyue Gong, Lizhen Nie, Denny Zhou, Adam R. Klivans, Qiang Liu (03 Mar 2020)
- A Rigorous Framework for the Mean Field Limit of Multilayer Neural Networks. Phan-Minh Nguyen, H. Pham (30 Jan 2020) [AI4CE]
- Mean-Field and Kinetic Descriptions of Neural Differential Equations. Michael Herty, T. Trimborn, G. Visconti (07 Jan 2020)
- Machine Learning from a Continuous Viewpoint. E. Weinan, Chao Ma, Lei Wu (30 Dec 2019)
- Landscape Connectivity and Dropout Stability of SGD Solutions for Over-parameterized Neural Networks. Aleksandr Shevchenko, Marco Mondelli (20 Dec 2019)