
arXiv:0812.2818

Sparse Recovery under Matrix Uncertainty

15 December 2008
M. Rosenbaum
Alexandre B. Tsybakov
Abstract

We consider the model y = Xθ + e, Z = X + v, where the n-dimensional random vector y and the n × p random matrix Z are observed, the n × p matrix X is unknown, v is an n × p random noise matrix, e is a noise vector independent of v, and θ is a vector of unknown parameters to be estimated. The matrix uncertainty lies in the fact that X is observed only with additive error. We consider the estimation of sparse vectors θ for dimensions p that can be much larger than the sample size n. Under matrix uncertainty, the Lasso and the Dantzig selector turn out to be extremely unstable in recovering the sparsity pattern (i.e., the set of non-zero components of θ), even if the noise level is very small. We suggest new estimators, called matrix uncertainty selectors (or MU-selectors for short), which are close to θ in various norms and in the prediction risk when the restricted eigenvalue assumption on X is satisfied. We also show that, under somewhat stronger assumptions, these estimators correctly recover the sparsity pattern.
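The abstract's central claim is that ordinary sparse estimators become unreliable once the design matrix is only observed with additive noise. The following is a minimal simulation sketch of that errors-in-variables setup: it generates y = Xθ + e together with Z = X + v and fits a plain Lasso on the noisy design Z as the naive baseline. All dimensions, noise levels, the regularization strength, and the use of scikit-learn's Lasso are assumptions made for illustration; the MU-selector itself is defined in the paper and is not reproduced here.

```python
# Illustrative sketch only: simulates the model from the abstract
# (y = X @ theta + e observed together with Z = X + v) and runs a plain
# Lasso on the noisy design Z, ignoring the matrix uncertainty.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

n, p, s = 100, 500, 5          # sample size, dimension (p >> n), sparsity
theta = np.zeros(p)
theta[:s] = 1.0                # s non-zero components

X = rng.standard_normal((n, p))        # unknown design matrix
e = 0.01 * rng.standard_normal(n)      # small noise on the response
v = 0.1 * rng.standard_normal((n, p))  # additive error on the design

y = X @ theta + e              # observed response
Z = X + v                      # observed (noisy) design; X itself is not seen

# Naive approach: ignore the matrix uncertainty and fit the Lasso on (Z, y).
lasso = Lasso(alpha=0.05, max_iter=10000)
lasso.fit(Z, y)

support_true = set(np.flatnonzero(theta))
support_hat = set(np.flatnonzero(lasso.coef_))
print("true support:     ", sorted(support_true))
print("estimated support:", sorted(support_hat))
print("false positives:  ", len(support_hat - support_true))
```

In this sketch the estimator sees only Z, so the regression y = Zθ + (e − vθ) carries an extra error term −vθ that grows with the size of θ; this is the instability the abstract refers to, and the MU-selectors proposed in the paper are designed to account for that additional term.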
