arXiv:1611.06655
Sparse Sliced Inverse Regression Via Lasso

21 November 2016
Q. Lin
Zhigen Zhao
Jun S. Liu
Abstract

For multiple-index models, it has recently been shown that sliced inverse regression (SIR) is consistent for estimating the sufficient dimension reduction (SDR) space if and only if $\rho = \lim p/n = 0$, where $p$ is the dimension and $n$ is the sample size. Thus, when $p$ is of the same or higher order than $n$, additional assumptions such as sparsity must be imposed to ensure the consistency of SIR. By constructing artificial response variables from the top eigenvectors of the estimated conditional covariance matrix, we introduce a simple Lasso regression method to obtain an estimate of the SDR space. The resulting algorithm, Lasso-SIR, is shown to be consistent and to achieve the optimal convergence rate under certain sparsity conditions when $p$ is of order $o(n^2\lambda^2)$, where $\lambda$ is the generalized signal-to-noise ratio. We also demonstrate the superior performance of Lasso-SIR compared with existing approaches via extensive numerical studies and several real data examples.
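The abstract's recipe — slice the response, estimate the conditional covariance Cov(E[X|y]) from slice means, build artificial responses from its top eigenvectors, and run a Lasso of those pseudo-responses on X — can be sketched as follows. This is a minimal illustration of the idea, not the paper's exact construction: the function name `lasso_sir`, the slicing scheme, the eigenvalue scaling, and the defaults (`n_slices=10`, `alpha=0.01`) are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_sir(X, y, n_slices=10, d=1, alpha=0.1):
    """Sketch of the Lasso-SIR idea: pseudo-responses from the top
    eigenvectors of the estimated Cov(E[X|y]), then one Lasso fit
    per pseudo-response.  Illustrative only, not the paper's exact
    estimator."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # Slice y into roughly equal-sized slices; average X within each.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    means = np.vstack([Xc[idx].mean(axis=0) for idx in slices])
    weights = np.array([len(idx) / n for idx in slices])
    # Weighted slice means give an estimate of Cov(E[X|y]).
    Lam = (means * weights[:, None]).T @ means
    eigvals, eigvecs = np.linalg.eigh(Lam)   # ascending order
    top = eigvecs[:, -d:]                    # top-d eigenvectors
    lam_top = eigvals[-d:]
    # Artificial responses: map each sample to its slice mean,
    # project onto the top eigenvectors, rescale by the eigenvalues.
    membership = np.zeros((n, n_slices))
    for h, idx in enumerate(slices):
        membership[idx, h] = 1.0 / len(idx)
    pseudo_y = membership @ means @ top / lam_top    # shape (n, d)
    # One Lasso fit per pseudo-response yields sparse SDR directions.
    B = np.column_stack([
        Lasso(alpha=alpha, fit_intercept=False).fit(Xc, pseudo_y[:, k]).coef_
        for k in range(d)
    ])
    return B  # p x d matrix of estimated sparse directions

# Toy single-index model: y depends on X only through a sparse direction.
rng = np.random.default_rng(0)
n, p = 500, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = 1.0                                   # sparse true direction
y = X @ beta + 0.5 * rng.standard_normal(n)
B = lasso_sir(X, y, n_slices=10, d=1, alpha=0.01)
print(B.shape)                                   # (50, 1)
```

Here the Lasso's zero pattern is what delivers the sparsity the abstract refers to: coordinates outside the true support are shrunk to exactly zero, so the fitted direction is itself sparse.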
