arXiv:1911.05073
Sparse estimation via $\ell_q$ optimization method in high-dimensional linear regression

12 November 2019
X. Li
Yaohua Hu
Chong Li
Xiaoqi Yang
T. Jiang
Abstract

In this paper, we study the statistical properties of the $\ell_q$ optimization methods $(0<q\leq 1)$, including the $\ell_q$ minimization method and the $\ell_q$ regularization method, for estimating a sparse parameter from noisy observations in high-dimensional linear regression with either a deterministic or random design. For this purpose, we introduce a general $q$-restricted eigenvalue condition ($q$-REC) and provide sufficient conditions for it in terms of several widely used regularity conditions, such as the sparse eigenvalue condition, the restricted isometry property, and the mutual incoherence property. By virtue of the $q$-REC, we establish the stable recovery property of the $\ell_q$ optimization methods for both deterministic and random designs by showing that the $\ell_2$ recovery bound $O(\epsilon^2)$ for the $\ell_q$ minimization method, and the oracle inequality and $\ell_2$ recovery bound $O(\lambda^{\frac{2}{2-q}}s)$ for the $\ell_q$ regularization method, hold with high probability. The results in this paper are nonasymptotic and assume only the weak $q$-REC. Preliminary numerical results verify the established statistical properties and demonstrate the advantages of the $\ell_q$ regularization method over some existing sparse optimization methods.
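The $\ell_q$ regularization problem, $\min_\beta \|X\beta - y\|_2^2 + \lambda \sum_i |\beta_i|^q$, is nonconvex for $q<1$. One common way to compute an approximate minimizer — a generic numerical sketch, not the algorithm analyzed in the paper — is iteratively reweighted least squares with a smoothed penalty and a continuation schedule. In the sketch below, the function name, the parameter choices ($\lambda$, $q$, the smoothing decay), and the stopping rule are all illustrative assumptions:

```python
import numpy as np

def lq_irls(X, y, lam, q=0.5, n_iter=200):
    """IRLS sketch for min_b ||X b - y||_2^2 + lam * sum_i |b_i|^q, 0 < q <= 1.

    Each iteration minimizes a quadratic majorizer of the smoothed penalty:
    |b_i|^q is replaced by (q/2) * w_i * b_i^2 with w_i = (b_i^2 + eps)^(q/2 - 1),
    where eps is decreased over iterations (continuation).
    """
    n, p = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]  # min-norm start when p > n
    XtX, Xty = X.T @ X, X.T @ y
    eps = 1.0  # smoothing parameter; shrinks toward 0 as iterations proceed
    for _ in range(n_iter):
        w = (b**2 + eps) ** (q / 2 - 1)       # reweighting from current iterate
        # Normal equations of the majorized problem: (X'X + (lam*q/2) W) b = X'y
        b = np.linalg.solve(XtX + (lam * q / 2) * np.diag(w), Xty)
        eps = max(eps * 0.7, 1e-8)
    return b
```

On a small synthetic high-dimensional problem ($n=50$, $p=100$, $s=5$ nonzero coefficients), this recovers the sparse parameter well; coordinates off the true support are driven to near zero as the smoothing parameter decays, mimicking the sparsity-inducing effect of the $\ell_q$ penalty.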
