Sharp Lower Bounds on Interpolation by Deep ReLU Neural Networks at Irregularly Spaced Data
Jonathan W. Siegel
2 February 2023 · arXiv:2302.00834

Papers citing "Sharp Lower Bounds on Interpolation by Deep ReLU Neural Networks at Irregularly Spaced Data" (4 papers shown)
On the Optimal Expressive Power of ReLU DNNs and Its Application in Approximation with Kolmogorov Superposition Theorem
Juncai He · 10 Aug 2023
Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
Zuowei Shen, Haizhao Yang, Shijun Zhang · 28 Feb 2021
Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with $\ell^1$ and $\ell^0$ Controls
Jason M. Klusowski, Andrew R. Barron · 26 Jul 2016
Benefits of depth in neural networks
Matus Telgarsky · 14 Feb 2016