arXiv:2104.02746
Proof of the Theory-to-Practice Gap in Deep Learning via Sampling Complexity bounds for Neural Network Approximation Spaces
6 April 2021
Philipp Grohs
F. Voigtlaender
Papers citing "Proof of the Theory-to-Practice Gap in Deep Learning via Sampling Complexity bounds for Neural Network Approximation Spaces" (8 of 8 papers shown)
High-dimensional classification problems with Barron regular boundaries under margin conditions
Jonathan García, Philipp Petersen (10 Dec 2024)
Algorithmically Designed Artificial Neural Networks (ADANNs): Higher order deep operator learning for parametric partial differential equations
Arnulf Jentzen, Adrian Riekert, Philippe von Wurstemberger (07 Feb 2023)
Approximation results for Gradient Descent trained Shallow Neural Networks in 1d
R. Gentile, G. Welper (17 Sep 2022)
Constrained Few-Shot Learning: Human-Like Low Sample Complexity Learning and Non-Episodic Text Classification
Jaron Mar, Jiamou Liu (17 Aug 2022)
Learning ReLU networks to high uniform accuracy is intractable
Julius Berner, Philipp Grohs, F. Voigtlaender (26 May 2022)
Sobolev-type embeddings for neural network approximation spaces
Philipp Grohs, F. Voigtlaender (28 Oct 2021)
Deep neural network solution of the electronic Schrödinger equation
J. Hermann, Zeno Schätzle, Frank Noé (16 Sep 2019)
Benefits of depth in neural networks
Matus Telgarsky (14 Feb 2016)