Exponential Expressivity of ReLU$^k$ Neural Networks on Gevrey Classes with Point Singularities

4 March 2024
J. Opschoor
Christoph Schwab
arXiv:2403.02035 · PDF · HTML
Abstract

We analyze deep Neural Network emulation rates of smooth functions with point singularities in bounded, polytopal domains $\mathrm{D} \subset \mathbb{R}^d$, $d=2,3$. We prove exponential emulation rates in Sobolev spaces in terms of the number of neurons and in terms of the number of nonzero coefficients for Gevrey-regular solution classes defined in terms of weighted Sobolev scales in $\mathrm{D}$, comprising the countably-normed spaces of I.M. Babuška and B.Q. Guo. As an intermediate result, we prove that continuous, piecewise polynomial, high-order ("$p$-version") finite elements with elementwise polynomial degree $p \in \mathbb{N}$ on arbitrary, regular, simplicial partitions of polyhedral domains $\mathrm{D} \subset \mathbb{R}^d$, $d \geq 2$, can be exactly emulated by neural networks combining ReLU and ReLU$^2$ activations. On shape-regular, simplicial partitions of polytopal domains $\mathrm{D}$, both the number of neurons and the number of nonzero parameters are proportional to the number of degrees of freedom of the finite element space, in particular for the $hp$-Finite Element Method of I.M. Babuška and B.Q. Guo.
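To illustrate one ingredient behind exact polynomial emulation with ReLU$^2$ activations, the following minimal sketch (a one-dimensional illustration, not the paper's construction on simplicial partitions) checks numerically that two ReLU$^2$ neurons realize $x^2$ exactly, since $\mathrm{ReLU}(x)^2 + \mathrm{ReLU}(-x)^2 = x^2$, and hence, via the polarization identity $xy = \tfrac{1}{4}\big((x+y)^2 - (x-y)^2\big)$, that products of network inputs are exactly representable. The function names below are hypothetical.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def square_via_relu2(x):
    # Two ReLU^2 neurons realize x**2 exactly:
    # relu(x)**2 + relu(-x)**2 == x**2 for every real x.
    return relu(x)**2 + relu(-x)**2

def product_via_relu2(x, y):
    # Polarization identity: x*y = ((x+y)**2 - (x-y)**2) / 4,
    # so four ReLU^2 neurons realize a product exactly.
    return (square_via_relu2(x + y) - square_via_relu2(x - y)) / 4.0

x = np.linspace(-2.0, 2.0, 9)
y = np.linspace(-1.0, 3.0, 9)
assert np.allclose(square_via_relu2(x), x**2)
assert np.allclose(product_via_relu2(x, y), x * y)

Iterating such exact products over the monomials of an elementwise polynomial, and using ReLU units for the continuous piecewise-linear geometry, indicates why the number of neurons can be kept proportional to the number of finite element degrees of freedom, as claimed in the abstract.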
