Deep Network Approximation for Smooth Functions

9 January 2020 · arXiv:2001.03040
Jianfeng Lu, Zuowei Shen, Haizhao Yang, Shijun Zhang
Abstract

This paper establishes the (nearly) optimal approximation error characterization of deep rectified linear unit (ReLU) networks for smooth functions in terms of both width and depth simultaneously. To that end, we first prove that multivariate polynomials can be approximated by deep ReLU networks of width $\mathcal{O}(N)$ and depth $\mathcal{O}(L)$ with an approximation error $\mathcal{O}(N^{-L})$. Through local Taylor expansions and their deep ReLU network approximations, we show that deep ReLU networks of width $\mathcal{O}(N\ln N)$ and depth $\mathcal{O}(L\ln L)$ can approximate $f\in C^s([0,1]^d)$ with a nearly optimal approximation error $\mathcal{O}\big(\|f\|_{C^s([0,1]^d)}\, N^{-2s/d} L^{-2s/d}\big)$. Our estimate is non-asymptotic in the sense that it is valid for arbitrary width and depth specified by $N\in\mathbb{N}^+$ and $L\in\mathbb{N}^+$, respectively.
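To make the depth-exponential rate $\mathcal{O}(N^{-L})$ concrete, a standard building block for polynomial approximation by ReLU networks (the Yarotsky-style construction, not code from this paper) approximates $x^2$ on $[0,1]$ by composing a piecewise-linear "hat" function, cutting the error by a factor of 4 per composition. The numpy sketch below illustrates this; the helper names `tooth` and `relu_square` are hypothetical, and the identity used is $x^2 = x - \sum_{k\ge 1} g^{\circ k}(x)/4^k$ for the hat function $g$:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tooth(x):
    # Hat function g: [0,1] -> [0,1], peaking at x = 1/2,
    # expressed with three ReLU units.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def relu_square(x, depth):
    # Yarotsky-style approximation of x^2 on [0,1]:
    #   x^2 ~= x - sum_{k=1}^{depth} g^{(k)}(x) / 4^k,
    # where g^{(k)} is the k-fold composition of the hat function.
    # Each extra term adds O(1) ReLU layers, so the network depth
    # grows linearly while the error shrinks like 4^{-(depth+1)}.
    g = x.copy()
    approx = x.copy()
    for k in range(1, depth + 1):
        g = tooth(g)               # k-fold composition
        approx = approx - g / 4**k
    return approx

x = np.linspace(0.0, 1.0, 1001)
for depth in (2, 4, 6, 8):
    err = np.max(np.abs(x**2 - relu_square(x, depth)))
    print(f"depth {depth}: max error {err:.2e}  (theory: {4.0**-(depth+1):.2e})")
```

Each additional composition adds a constant number of ReLU layers but reduces the sup-norm error by a factor of 4; this error-decaying-exponentially-in-depth behavior is the mechanism behind the $\mathcal{O}(N^{-L})$ polynomial-approximation step, which the paper then combines with local Taylor expansions to handle general $f\in C^s([0,1]^d)$.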
