The universal approximation power of finite-width deep ReLU networks
Dmytro Perekrestenko, Philipp Grohs, Dennis Elbrächter, Helmut Bölcskei
arXiv:1806.01528, 5 June 2018
Papers citing "The universal approximation power of finite-width deep ReLU networks" (8 papers):

1. Training Neural Networks Using Reproducing Kernel Space Interpolation and Model Reduction. Eric A. Werneburg. 31 Aug 2023.
2. Data Topology-Dependent Upper Bounds of Neural Network Widths. Sangmin Lee, Jong Chul Ye. 25 May 2023.
3. Seeking Interpretability and Explainability in Binary Activated Neural Networks. Benjamin Leblanc, Pascal Germain. [FAtt] 07 Sep 2022.
4. Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation. Arnulf Jentzen, Timo Welti. 03 Mar 2020.
5. Growing axons: greedy learning of neural networks with application to function approximation. Daria Fokina, Ivan Oseledets. 28 Oct 2019.
6. The Oracle of DLphi. Dominik Alfke, W. Baines, J. Blechschmidt, Mauricio J. del Razo Sarmina, Amnon Drory, ..., L. Thesing, Philipp Trunschke, Johannes von Lindheim, David Weber, Melanie Weber. 17 Jan 2019.
7. A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients. Arnulf Jentzen, Diyora Salimova, Timo Welti. [AI4CE] 19 Sep 2018.
8. A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations. Philipp Grohs, F. Hornung, Arnulf Jentzen, Philippe von Wurstemberger. 07 Sep 2018.