Blessing of Dimensionality for Approximating Sobolev Classes on Manifolds

Abstract

The manifold hypothesis says that natural high-dimensional data lie on or near a low-dimensional manifold. The recent success of statistical and learning-based methods in very high dimensions empirically supports this hypothesis, suggesting that typical worst-case analysis does not provide practical guarantees. A natural step for analysis is thus to assume the manifold hypothesis and derive bounds that are independent of the ambient dimension in which the data are embedded. Theoretical implications in this direction have recently been explored in terms of generalization of ReLU networks and convergence of Langevin methods. In this work, we consider optimal uniform approximations with functions of finite statistical complexity. While upper bounds on uniform approximation exist in the literature using ReLU neural networks, we consider the opposite: lower bounds that quantify the fundamental difficulty of approximation on manifolds. In particular, we demonstrate that the statistical complexity required to approximate a class of bounded Sobolev functions on a compact manifold is bounded from below, and moreover that this bound depends only on intrinsic properties of the manifold, such as curvature, volume, and injectivity radius.

@article{tan2025_2408.06996,
  title={Blessing of Dimensionality for Approximating Sobolev Classes on Manifolds},
  author={Hong Ye Tan and Subhadip Mukherjee and Junqi Tang and Carola-Bibiane Schönlieb},
  journal={arXiv preprint arXiv:2408.06996},
  year={2025}
}