arXiv:1911.08061
Sparse recovery via nonconvex regularized M-estimators over ℓ_q-balls

19 November 2019
Xin Li
Dongya Wu
Chong Li
Jinhua Wang
J. Yao
Abstract

In this paper, we analyse the recovery properties of nonconvex regularized M-estimators under the assumption that the true parameter is of soft sparsity (i.e., lies in an ℓ_q-ball). On the statistical side, we establish a recovery bound for any stationary point of the nonconvex regularized M-estimator, under restricted strong convexity and certain regularity conditions on the loss function and the regularizer, respectively. On the algorithmic side, we apply a suitable decomposition of the objective function and then solve the nonconvex optimization problem via the proximal gradient method, which is proved to achieve a linear convergence rate. In particular, for commonly used regularizers such as SCAD and MCP, our assumption on the regularizer admits a simpler decomposition, which helps to construct estimators with better recovery performance. Finally, we illustrate the theoretical results and the advantage of this assumption through several numerical experiments on a corrupted errors-in-variables linear regression model. The simulation results show close agreement with our theory under high-dimensional scaling.
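The proximal gradient iteration that the abstract refers to can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: it uses a least-squares loss and the ℓ_1 soft-thresholding proximal map as a simple convex stand-in for the SCAD/MCP decomposition analysed in the paper, and all function names and parameter choices here are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal map of t * ||.||_1, applied elementwise (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient(X, y, lam, step=None, n_iter=500):
    # Minimise (1/2n) * ||y - X beta||^2 + lam * ||beta||_1
    # by alternating a gradient step on the smooth loss with the
    # proximal map of the regularizer.
    n, p = X.shape
    if step is None:
        # Step size 1/L, where L = ||X||_2^2 / n bounds the gradient's
        # Lipschitz constant for the least-squares loss.
        step = n / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n      # gradient of the smooth part
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

For a nonconvex regularizer such as MCP or SCAD, only the proximal map changes (both admit closed-form proximal operators); the outer iteration keeps the same gradient-step-then-prox structure.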

View on arXiv