arXiv:2001.07883
Learning functions varying along a central subspace

22 January 2020
Hao Liu
Wenjing Liao
Abstract

Many functions of interest are in a high-dimensional space but exhibit low-dimensional structures. This paper studies regression of an $s$-Hölder function $f$ in $\mathbb{R}^D$ which varies along a central subspace of dimension $d$, where $d \ll D$. A direct approximation of $f$ in $\mathbb{R}^D$ with $\varepsilon$ accuracy requires the number of samples $n$ to be on the order of $\varepsilon^{-(2s+D)/s}$. In this paper, we analyze the Generalized Contour Regression (GCR) algorithm for the estimation of the central subspace and use piecewise polynomials for function approximation. GCR is among the best estimators of the central subspace, but its sample complexity is an open question. We prove that GCR achieves a mean squared estimation error of $O(n^{-1})$ for the central subspace, provided that a certain variance quantity is exactly known. The estimation error of this variance quantity is also given in this paper. The mean squared regression error of $f$ is proved to be on the order of $(n/\log n)^{-\frac{2s}{2s+d}}$, where the exponent depends on the dimension $d$ of the central subspace instead of the ambient dimension $D$. This result demonstrates that GCR is effective in learning the low-dimensional central subspace. We also propose a modified GCR with improved efficiency. The convergence rate is validated through several numerical experiments.
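To get a feel for the two rates quoted in the abstract, the following sketch evaluates the ambient-space sample requirement $\varepsilon^{-(2s+D)/s}$ and the central-subspace regression error $(n/\log n)^{-2s/(2s+d)}$ for hypothetical values of $s$, $d$, $D$, $\varepsilon$, and $n$ (all chosen here purely for illustration, not taken from the paper's experiments):

```python
import math

def samples_ambient(eps, s, D):
    """Samples needed for direct eps-accurate approximation in R^D,
    per the abstract's rate: ~ eps^{-(2s+D)/s}."""
    return eps ** (-(2 * s + D) / s)

def mse_central_subspace(n, s, d):
    """Mean squared regression error when f varies along a
    d-dimensional central subspace: ~ (n / log n)^{-2s/(2s+d)}."""
    return (n / math.log(n)) ** (-2 * s / (2 * s + d))

# Hypothetical setting: Hölder smoothness s = 2, subspace dimension
# d = 2, ambient dimension D = 100.
s, d, D = 2, 2, 100

# Direct approximation in R^100 at accuracy 0.1 needs an astronomical
# number of samples (the curse of dimensionality).
print(samples_ambient(0.1, s, D))

# With the central subspace exploited, the error exponent involves
# d = 2 rather than D = 100, so the error decays at a practical rate.
print(mse_central_subspace(10_000, s, d))
```

The point of the comparison is that the exponent governing the error depends only on the intrinsic dimension $d$, so increasing $D$ leaves the second rate unchanged while the first blows up.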
