arXiv:1908.00695
Deep ReLU network approximation of functions on a manifold

2 August 2019
Johannes Schmidt-Hieber
Abstract

Whereas recovery of the manifold from data is a well-studied topic, approximation rates for functions defined on manifolds are less known. In this work, we study a regression problem with inputs on a $d^*$-dimensional manifold that is embedded into a space with potentially much larger ambient dimension. It is shown that sparsely connected deep ReLU networks can approximate a H\"older function with smoothness index $\beta$ up to error $\epsilon$ using of the order of $\epsilon^{-d^*/\beta}\log(1/\epsilon)$ many non-zero network parameters. As an application, we derive statistical convergence rates for the estimator minimizing the empirical risk over all possible choices of bounded network parameters.
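The point of the stated rate is that the parameter budget scales with the intrinsic dimension $d^*$ rather than the ambient dimension. As a rough illustration (values and constants are not from the paper), the following Python sketch evaluates the order $\epsilon^{-d^*/\beta}\log(1/\epsilon)$ for assumed example values and contrasts it with the same expression evaluated at the ambient dimension.

```python
import math

def relu_param_budget(eps: float, dim: float, beta: float) -> float:
    """Order-of-magnitude count of non-zero network parameters,
    eps**(-dim / beta) * log(1 / eps), as stated in the abstract
    (multiplicative constants omitted)."""
    return eps ** (-dim / beta) * math.log(1.0 / eps)

# Illustrative, assumed values: a 3-dimensional manifold embedded in R^100,
# Hoelder smoothness beta = 2, target approximation error eps = 0.01.
d_star, ambient_dim, beta, eps = 3, 100, 2.0, 1e-2

print(relu_param_budget(eps, d_star, beta))       # ~4.6e3  (intrinsic dimension)
print(relu_param_budget(eps, ambient_dim, beta))  # ~4.6e100 (ambient dimension)
```

The second number illustrates the curse of dimensionality that a rate depending on the ambient dimension would incur; the result in the paper avoids it by exploiting the manifold structure of the inputs.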
