Ridgeless Interpolation with Shallow ReLU Networks in $1D$ is Nearest Neighbor Curvature Extrapolation and Provably Generalizes on Lipschitz Functions
arXiv: 2109.12960
27 September 2021
Boris Hanin
Papers citing "Ridgeless Interpolation with Shallow ReLU Networks in $1D$ is Nearest Neighbor Curvature Extrapolation and Provably Generalizes on Lipschitz Functions" (9 papers shown)
The Effects of Multi-Task Learning on ReLU Neural Network Functions
Julia B. Nakhleh, Joseph Shenouda, Robert D. Nowak
29 Oct 2024
How do Minimum-Norm Shallow Denoisers Look in Function Space?
Chen Zeno, Greg Ongie, Yaniv Blumenfeld, Nir Weinberger, Daniel Soudry
12 Nov 2023
Minimum norm interpolation by perceptra: Explicit regularization and implicit bias
Jiyoung Park, Ian Pelakh, Stephan Wojtowytsch
10 Nov 2023
Noisy Interpolation Learning with Shallow Univariate ReLU Networks
Nirmit Joshi, Gal Vardi, Nathan Srebro
28 Jul 2023
Optimal bump functions for shallow ReLU networks: Weight decay, depth separation and the curse of dimensionality
Stephan Wojtowytsch
02 Sep 2022
Intrinsic dimensionality and generalization properties of the $\mathcal{R}$-norm inductive bias
Navid Ardeshir, Daniel J. Hsu, Clayton Sanford
10 Jun 2022
On the Effective Number of Linear Regions in Shallow Univariate ReLU Networks: Convergence Guarantees and Implicit Bias
Itay Safran, Gal Vardi, Jason D. Lee
18 May 2022
Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
14 Jun 2018
Norm-Based Capacity Control in Neural Networks
Behnam Neyshabur, Ryota Tomioka, Nathan Srebro
27 Feb 2015