ResearchTrend.AI
arXiv:0805.3224
Consistent selection via the Lasso for high dimensional approximating regression models

21 May 2008
F. Bunea
Abstract

In this article we investigate consistency of selection in regression models via the popular Lasso method. Here we depart from the traditional linear regression assumption and consider approximations of the regression function $f$ with elements of a given dictionary of $M$ functions. The target for consistency is the index set of those functions from this dictionary that realize the most parsimonious approximation to $f$ among all linear combinations belonging to an $L_2$ ball centered at $f$ and of radius $r_{n,M}^2$. In this framework we show that a consistent estimate of this index set can be derived via $\ell_1$ penalized least squares, with a data-dependent penalty and with tuning sequence $r_{n,M} > \sqrt{\log(Mn)/n}$, where $n$ is the sample size. Our results hold for any $1 \leq M \leq n^{\gamma}$, for any $\gamma > 0$.
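The selection procedure the abstract describes can be sketched numerically: fit an $\ell_1$-penalized least-squares (Lasso) regression over a dictionary of $M$ functions with a tuning parameter just above $\sqrt{\log(Mn)/n}$, then read off the nonzero coefficients as the estimated index set. This is a minimal illustration, not the paper's exact data-dependent penalty; the design, the true sparse index set, and the constant multiplier on the tuning sequence are all assumptions chosen for the demo.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative setup (assumed, not from the paper): a random Gaussian
# dictionary of M columns, with the regression function a sparse linear
# combination of three of them plus noise.
rng = np.random.default_rng(0)
n, M = 200, 50                       # sample size and dictionary size
X = rng.standard_normal((n, M))      # dictionary evaluated at design points
true_idx = [0, 3, 7]                 # "parsimonious" index set (assumed)
beta = np.zeros(M)
beta[true_idx] = [2.0, -1.5, 1.0]
y = X @ beta + 0.5 * rng.standard_normal(n)

# Tuning sequence r_{n,M} > sqrt(log(Mn)/n) from the abstract;
# the factor 2 is an illustrative constant, not the paper's choice.
r_nM = 2.0 * np.sqrt(np.log(M * n) / n)

# l1-penalized least squares; nonzero coefficients estimate the index set.
lasso = Lasso(alpha=r_nM).fit(X, y)
selected = np.flatnonzero(lasso.coef_)
print(sorted(selected.tolist()))
```

With this tuning, the penalty is large enough to zero out noise coordinates while the three truly active dictionary elements survive, which is the consistency phenomenon the paper establishes under far more general conditions.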
