Optimal Approximation with Sparse Neural Networks and Applications

14 August 2021
Khay Boon Hong
arXiv:2108.06467
Abstract

We use deep, sparsely connected neural networks to measure the complexity of a function class in $L^2(\mathbb{R}^d)$ by restricting the connectivity and the memory required to store the networks. We also introduce representation systems, countable collections of functions that guide the neural networks, since approximation theory for representation systems is well developed in mathematics. We then prove the fundamental bound theorem, which shows that a quantity intrinsic to the function class itself carries information about the approximation ability of both neural networks and representation systems. We also provide a method for transferring existing results on approximation by representation systems to neural networks, greatly amplifying the practical value of neural networks. Finally, we use neural networks to approximate B-spline functions, which are used to generate B-spline curves, and we analyse the complexity of the class of $\beta$ cartoon-like functions using rate-distortion theory and a wedgelet construction.
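
The B-spline functions the abstract refers to are standard objects, defined by the Cox-de Boor recursion rather than by anything specific to this paper. As a point of reference only (the knot vector and evaluation points below are illustrative assumptions, not taken from the paper), a minimal Python sketch of that recursion:

import numpy as np

def bspline_basis(i, p, t, knots):
    # Cox-de Boor recursion: the i-th B-spline basis function of degree p.
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:
        left = (t - knots[i]) / d1 * bspline_basis(i, p - 1, t, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + p + 1] - t) / d2 * bspline_basis(i + 1, p - 1, t, knots)
    return left + right

# Cubic (degree-3) basis functions on a uniform knot vector 0..7;
# a knot vector of length m supports m - p - 1 basis functions of degree p.
knots = np.arange(8.0)
for t in (1.5, 3.5, 5.5):
    vals = [bspline_basis(i, 3, t, knots) for i in range(len(knots) - 4)]
    print(t, np.round(vals, 4))

A B-spline curve is then a linear combination of these basis functions with control-point coefficients, which is the construction the paper's neural-network approximation targets.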
