  3. 2304.04443

Approximation of Nonlinear Functionals Using Deep ReLU Networks

10 April 2023
Linhao Song, Jun Fan, Dirong Chen, Ding-Xuan Zhou
Abstract

In recent years, functional neural networks have been proposed and studied to approximate nonlinear continuous functionals defined on $L^p([-1,1]^s)$ for integers $s \ge 1$ and $1 \le p < \infty$. However, beyond universality of approximation their theoretical properties remain largely unknown, and the existing analyses do not apply to the rectified linear unit (ReLU) activation function. To fill this void, we investigate the approximation power of functional deep neural networks with the ReLU activation function by constructing a continuous piecewise linear interpolation under a simple triangulation. In addition, we establish rates of approximation for the proposed functional deep ReLU networks under mild regularity conditions. Finally, our study may also shed light on the understanding of functional data learning algorithms.
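The construction via continuous piecewise linear interpolation rests on a basic fact that a small sketch can illustrate: in one dimension, the piecewise linear interpolant of any function on a grid is exactly representable by a shallow ReLU network whose hidden-unit coefficients are the slope changes at the knots. The snippet below is an illustrative toy example in this spirit, not the paper's actual construction (which concerns functionals on $L^p([-1,1]^s)$); the uniform grid and the target function are arbitrary choices.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# A uniform grid on [-1, 1] plays the role of a simple 1-D "triangulation".
knots = np.linspace(-1.0, 1.0, 9)
target = np.sin(np.pi * knots)             # values to interpolate (arbitrary example)
slopes = np.diff(target) / np.diff(knots)  # slope of the interpolant on each sub-interval
# ReLU coefficients: first slope, then the slope change at each interior knot.
coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))

def cpwl_relu_net(x):
    """One-hidden-layer ReLU network realizing the piecewise linear interpolant."""
    x = np.asarray(x, dtype=float)
    hidden = relu(x[..., None] - knots[:-1])  # hidden-layer activations
    return target[0] + hidden @ coeffs

# The network agrees with the interpolant everywhere, in particular at the knots.
assert np.allclose(cpwl_relu_net(knots), target)
```

Since each hidden unit `relu(x - t_i)` turns on only to the right of knot `t_i`, summing them with the slope-change coefficients telescopes into exactly the linear interpolant on each sub-interval; higher-dimensional triangulations generalize this with more layers and units, which is where depth and approximation rates enter.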
