ResearchTrend.AI
The universal approximation power of finite-width deep ReLU networks

5 June 2018
Dmytro Perekrestenko
Philipp Grohs
Dennis Elbrächter
Helmut Bölcskei
Papers citing "The universal approximation power of finite-width deep ReLU networks"

8 papers
Training Neural Networks Using Reproducing Kernel Space Interpolation and Model Reduction
Eric A. Werneburg
31 Aug 2023
Data Topology-Dependent Upper Bounds of Neural Network Widths
Sangmin Lee, Jong Chul Ye
25 May 2023
Seeking Interpretability and Explainability in Binary Activated Neural Networks
Benjamin Leblanc, Pascal Germain
07 Sep 2022
Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
Arnulf Jentzen, Timo Welti
03 Mar 2020
Growing axons: greedy learning of neural networks with application to function approximation
Daria Fokina, Ivan Oseledets
28 Oct 2019
The Oracle of DLphi
Dominik Alfke, W. Baines, J. Blechschmidt, Mauricio J. del Razo Sarmina, Amnon Drory, ..., L. Thesing, Philipp Trunschke, Johannes von Lindheim, David Weber, Melanie Weber
17 Jan 2019
A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients
Arnulf Jentzen, Diyora Salimova, Timo Welti
19 Sep 2018
A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations
Philipp Grohs, F. Hornung, Arnulf Jentzen, Philippe von Wurstemberger
07 Sep 2018