arXiv: 2204.11231
Do ReLU Networks Have An Edge When Approximating Compactly-Supported Functions?
24 April 2022
Anastasis Kratsios, Behnoosh Zamanlooy
Papers citing "Do ReLU Networks Have An Edge When Approximating Compactly-Supported Functions?" (6 papers shown)
Approximation Rates and VC-Dimension Bounds for (P)ReLU MLP Mixture of Experts
Anastasis Kratsios, Haitz Sáez de Ocáriz Borde, Takashi Furuya, Marc T. Law (05 Feb 2024)

Data Topology-Dependent Upper Bounds of Neural Network Widths
Sangmin Lee, Jong Chul Ye (25 May 2023)

Designing Universal Causal Deep Learning Models: The Case of Infinite-Dimensional Dynamical Systems from Stochastic Analysis
Luca Galimberti, Anastasis Kratsios, Giulia Livieri (24 Oct 2022)

Universal Approximation Under Constraints is Possible with Transformers
Anastasis Kratsios, Behnoosh Zamanlooy, Tianlin Liu, Ivan Dokmanić (07 Oct 2021)

Rapid training of deep neural networks without skip connections or normalization layers using Deep Kernel Shaping
James Martens, Andy Ballard, Guillaume Desjardins, G. Swirszcz, Valentin Dalibard, Jascha Narain Sohl-Dickstein, S. Schoenholz (05 Oct 2021)

Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
Zuowei Shen, Haizhao Yang, Shijun Zhang (28 Feb 2021)