
arXiv:1106.4293 (v4, latest)

Tight conditions for consistency of variable selection in the context of high dimensionality

21 June 2011
L. Comminges
A. Dalalyan
Abstract

We address the issue of variable selection in the regression model with very high ambient dimension, i.e., when the number of variables is very large. The main focus is on the situation where the number of relevant variables, called the intrinsic dimension and denoted by d^*, is much smaller than the ambient dimension d. Without assuming any parametric form of the underlying regression function, we get tight conditions making it possible to consistently estimate the set of relevant variables. These conditions relate the intrinsic dimension to the ambient dimension and to the sample size. The procedure that is provably consistent under these tight conditions is based on comparing quadratic functionals of the empirical Fourier coefficients with appropriately chosen threshold values. The asymptotic analysis reveals the presence of two quite different regimes. The first regime is when d^* is fixed. In this case the situation in nonparametric regression is the same as in linear regression, i.e., consistent variable selection is possible if and only if log d is small compared to the sample size n. The picture is different in the second regime, d^* → ∞ as n → ∞, where we prove that consistent variable selection in the nonparametric setup is possible only if d^* + log log d is small compared to log n. We apply these results to derive minimax separation rates for the problem of variable selection.
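To give intuition for the kind of procedure the abstract describes, here is a minimal toy sketch in Python: for each candidate variable, a quadratic functional of its empirical Fourier (cosine) coefficients is compared against a threshold, and the variable is declared relevant if the threshold is exceeded. This is an illustrative simplification, not the authors' estimator — the paper's procedure uses multivariate Fourier coefficients and carefully calibrated thresholds, and the function names, the number of frequencies `K`, and the threshold value below are all assumptions made for the example.

```python
import numpy as np

def select_variables(X, y, K=5, threshold=0.1):
    """Toy variable selection: flag variable j as relevant when the
    energy (sum of squares) of its first K empirical cosine coefficients
    exceeds a threshold. Illustrates comparing quadratic functionals of
    empirical Fourier coefficients to thresholds; NOT the paper's exact
    (multivariate, calibrated) procedure."""
    n, d = X.shape
    selected = []
    for j in range(d):
        # empirical coefficients <y, phi_k(X_j)>/n with phi_k(x) = sqrt(2) cos(2*pi*k*x)
        coefs = [np.mean(y * np.sqrt(2) * np.cos(2 * np.pi * k * X[:, j]))
                 for k in range(1, K + 1)]
        energy = float(np.sum(np.square(coefs)))  # quadratic functional
        if energy > threshold:
            selected.append(j)
    return selected

# Synthetic example: ambient dimension d = 20, intrinsic dimension d* = 2.
rng = np.random.default_rng(0)
n, d = 2000, 20
X = rng.uniform(size=(n, d))
# regression function depends only on variables 0 and 1
y = (np.cos(2 * np.pi * X[:, 0]) + np.cos(2 * np.pi * X[:, 1])
     + 0.1 * rng.standard_normal(n))
print(select_variables(X, y))  # should recover the relevant set [0, 1]
```

For relevant coordinates the first cosine coefficient concentrates near a nonzero constant, so the energy stays of order one, while for irrelevant coordinates the empirical coefficients are of order n^{-1/2} and the energy vanishes — which is why a fixed threshold separates the two groups in this easy regime.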
