How to Initialize your Network? Robust Initialization for WeightNorm & ResNets
arXiv: 1906.02341
5 June 2019
Devansh Arpit, Victor Campos, Yoshua Bengio

Papers citing "How to Initialize your Network? Robust Initialization for WeightNorm & ResNets" (9 papers shown)

On the Lipschitz Constant of Deep Networks and Double Descent
Matteo Gamba, Hossein Azizpour, Mårten Björkman
28 Jan 2023

Deep Double Descent via Smooth Interpolation
Matteo Gamba, Erik Englesson, Mårten Björkman, Hossein Azizpour
21 Sep 2022

When Bioprocess Engineering Meets Machine Learning: A Survey from the Perspective of Automated Bioprocess Development
Nghia Duong-Trung, Stefan Born, Jong Woo Kim, M. Schermeyer, Katharina Paulick, ..., Thorben Werner, Randolf Scholz, Lars Schmidt-Thieme, Peter Neubauer, Ernesto Martinez
02 Sep 2022

Scaling ResNets in the Large-depth Regime
Pierre Marion, Adeline Fermanian, Gérard Biau, Jean-Philippe Vert
14 Jun 2022

Entangled Residual Mappings
Mathias Lechner, Ramin Hasani, Z. Babaiee, Radu Grosu, Daniela Rus, T. Henzinger, Sepp Hochreiter
02 Jun 2022

AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks
G. Bingham, Risto Miikkulainen
18 Sep 2021

The Future is Log-Gaussian: ResNets and Their Infinite-Depth-and-Width Limit at Initialization
Mufan Li, Mihai Nica, Daniel M. Roy
07 Jun 2021

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
14 Jun 2018

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016