ResearchTrend.AI
arXiv:2105.09917
Neural networks with superexpressive activations and integer weights

20 May 2021
A. Beknazaryan
Abstract

An example of an activation function $\sigma$ is given such that networks with activations $\{\sigma, \lfloor\cdot\rfloor\}$, integer weights, and a fixed architecture depending on $d$ approximate continuous functions on $[0,1]^d$. The range of integer weights required for $\varepsilon$-approximation of Hölder continuous functions is derived, which leads to a convergence rate of order $n^{\frac{-2\beta}{2\beta+d}}\log_2 n$ for neural network regression estimation of an unknown $\beta$-Hölder continuous function from $n$ given samples.
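As a rough illustration of how the stated rate behaves, the sketch below simply evaluates $n^{\frac{-2\beta}{2\beta+d}}\log_2 n$ for a few sample sizes (constant factors are ignored; the function name and chosen values of $\beta$ and $d$ are illustrative, not from the paper):

```python
import math

def holder_rate(n: int, beta: float, d: int) -> float:
    """Rate n^{-2*beta/(2*beta + d)} * log2(n), up to constants."""
    return n ** (-2 * beta / (2 * beta + d)) * math.log2(n)

# The rate decays as n grows; smoother targets (larger beta) and
# lower input dimension d yield faster decay.
for n in (10**3, 10**5, 10**7):
    print(n, holder_rate(n, beta=1.0, d=2))
```

Note the familiar curse of dimensionality: for fixed $\beta$, increasing $d$ pushes the exponent $\frac{-2\beta}{2\beta+d}$ toward zero, slowing convergence.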
