Besov Function Approximation and Binary Classification on Low-Dimensional Manifolds Using Convolutional Residual Networks

7 September 2021
Hao Liu
Minshuo Chen
T. Zhao
Wenjing Liao
arXiv:2109.02832 [PDF] [HTML]
Abstract

Most existing statistical theories on deep neural networks have sample complexities cursed by the data dimension and therefore cannot fully explain the empirical success of deep learning on high-dimensional data. To bridge this gap, we propose to exploit the low-dimensional geometric structures of real-world data sets. We establish theoretical guarantees for convolutional residual networks (ConvResNets) in terms of function approximation and statistical estimation for binary classification. Specifically, given data lying on a $d$-dimensional manifold isometrically embedded in $\mathbb{R}^D$, we prove that if the network architecture is properly chosen, ConvResNets can (1) approximate Besov functions on manifolds with arbitrary accuracy, and (2) learn a classifier by minimizing the empirical logistic risk, which gives an excess risk of order $n^{-\frac{s}{2s+2(s\vee d)}}$, where $s$ is a smoothness parameter. This implies that the sample complexity depends on the intrinsic dimension $d$ rather than the data dimension $D$. Our results demonstrate that ConvResNets are adaptive to low-dimensional structures of data sets.
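To make the rate concrete, here is an illustrative instantiation (the specific numbers are not from the paper): with smoothness $s = 2$ and intrinsic dimension $d = 3$, so $s \vee d = 3$, the bound becomes $n^{-\frac{2}{2\cdot 2 + 2\cdot 3}} = n^{-1/5}$, whether the ambient dimension $D$ is 10 or 10,000; only $d$ enters the exponent.

For readers who want a concrete picture of the model class, the sketch below shows a minimal convolutional residual network trained by empirical logistic risk minimization in PyTorch. It is a hedged illustration under assumptions of our own, not the construction analyzed in the paper: the classes ConvResBlock and ConvResNet, the hyperparameters, and the toy data are all hypothetical.

# Minimal sketch (assumed architecture, not the paper's construction):
# a ConvResNet-style binary classifier trained by minimizing the empirical logistic risk.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvResBlock(nn.Module):
    """One residual block: two 1D convolutions with an identity skip connection."""
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=pad)

    def forward(self, x):
        out = F.relu(self.conv1(x))
        out = self.conv2(out)
        return F.relu(out + x)  # skip connection

class ConvResNet(nn.Module):
    """Lift the input to several channels, apply residual blocks, then a linear head."""
    def __init__(self, in_dim: int, channels: int = 8, num_blocks: int = 4):
        super().__init__()
        self.lift = nn.Conv1d(1, channels, kernel_size=1)
        self.blocks = nn.Sequential(*[ConvResBlock(channels) for _ in range(num_blocks)])
        self.head = nn.Linear(channels * in_dim, 1)  # scalar logit for binary classification

    def forward(self, x):              # x: (batch, D)
        h = self.lift(x.unsqueeze(1))  # -> (batch, channels, D)
        h = self.blocks(h)
        return self.head(h.flatten(1)).squeeze(-1)

# Toy training loop; D plays the role of the ambient dimension.
D, n = 64, 256
x = torch.randn(n, D)
y = (x[:, 0] > 0).float()  # synthetic labels; real data would lie near a low-dimensional manifold
model = ConvResNet(in_dim=D)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    loss = F.binary_cross_entropy_with_logits(model(x), y)  # empirical logistic risk
    loss.backward()
    opt.step()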
