ResearchTrend.AI

arXiv:2409.17991

Dimension-independent learning rates for high-dimensional classification problems

26 September 2024
Andrés Felipe Lerma Pineda
P. Petersen
Simon Frieder
Thomas Lukasiewicz
Abstract

We study the problem of approximating and estimating classification functions whose decision boundary lies in the RBV^2 space. Functions of RBV^2 type arise naturally as solutions of regularized neural network learning problems, and neural networks can approximate these functions without the curse of dimensionality. We modify existing results to show that every RBV^2 function can be approximated by a neural network with bounded weights. We then prove the existence of a neural network with bounded weights that approximates a classification function, and we leverage these bounds to quantify the estimation rates. Finally, we present a numerical study that analyzes the effect of different regularity conditions on the decision boundaries.
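The abstract notes that RBV^2-type functions arise as solutions of regularized neural network learning problems. As a minimal illustrative sketch (not the paper's code, and with all data, widths, and hyperparameters chosen arbitrarily here), one can train a weight-decay-regularized shallow ReLU network, the setting where such regularized solutions are typically studied, as a classifier and take the sign of its output as the decision rule:

```python
import numpy as np

# Hedged toy sketch: fit a shallow ReLU network with weight decay
# (the kind of regularized learning problem the abstract refers to)
# on a synthetic 2D two-class dataset, classifying by sign(f(x)).
rng = np.random.default_rng(0)

# Two Gaussian blobs, labels +1 / -1 (assumed toy data).
n = 200
X = np.vstack([rng.normal(loc=[+2.0, +2.0], scale=0.7, size=(n, 2)),
               rng.normal(loc=[-2.0, -2.0], scale=0.7, size=(n, 2))])
y = np.concatenate([np.ones(n), -np.ones(n)])

# Shallow network f(x) = v . relu(W x + b) + c
width = 32
W = rng.normal(scale=0.5, size=(width, 2))
b = np.zeros(width)
v = rng.normal(scale=0.5, size=width)
c = 0.0

lam, lr = 1e-3, 0.05          # weight-decay strength, step size
for _ in range(500):
    pre = X @ W.T + b          # (N, width) pre-activations
    act = np.maximum(pre, 0.0) # ReLU
    f = act @ v + c            # (N,) network outputs
    # logistic loss on the margins y * f(x), averaged over samples
    g = -y / (1.0 + np.exp(y * f)) / len(y)   # dLoss/df per sample
    grad_v = act.T @ g + lam * v
    grad_c = g.sum()
    dact = np.outer(g, v) * (pre > 0)          # backprop through ReLU
    grad_W = dact.T @ X + lam * W
    grad_b = dact.sum(axis=0)
    v -= lr * grad_v; c -= lr * grad_c
    W -= lr * grad_W; b -= lr * grad_b

# Classify by the sign of the learned function.
pred = np.sign(np.maximum(X @ W.T + b, 0.0) @ v + c)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The weight-decay term `lam` plays the role of the regularizer; the decision boundary of the resulting classifier is the zero level set of the trained network.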
