Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations
Boris Hanin
9 August 2017
arXiv:1708.02691
Papers citing "Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations"
6 of 6 papers shown
Time to Spike? Understanding the Representational Power of Spiking Neural Networks in Discrete Time
Duc Anh Nguyen, Ernesto Araya, Adalbert Fono, Gitta Kutyniok
23 May 2025
Minimum Description Length of a Spectrum Variational Autoencoder: A Theory
Canlin Zhang, Xiuwen Liu
01 Apr 2025
Fundamental Limits of Deep Learning-Based Binary Classifiers Trained with Hinge Loss
T. Getu, Georges Kaddoum, M. Bennis
13 Sep 2023
Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms
Ingo Gühring, Gitta Kutyniok, P. Petersen
21 Feb 2019
Exponential expressivity in deep neural networks through transient chaos
Ben Poole, Subhaneil Lahiri, M. Raghu, Jascha Narain Sohl-Dickstein, Surya Ganguli
16 Jun 2016
Learning Functions: When Is Deep Better Than Shallow
H. Mhaskar, Q. Liao, T. Poggio
03 Mar 2016