Sparse and Smooth Signal Estimation: Convexification of L0 Formulations

6 November 2018
Alper Atamtürk
A. Gómez
Shaoning Han
Abstract

Signal estimation problems with smoothness and sparsity priors can be naturally modeled as quadratic optimization with $\ell_0$-"norm" constraints. Since such problems are non-convex and hard to solve, the standard approach is, instead, to tackle their convex surrogates based on $\ell_1$-norm relaxations. In this paper, we propose new iterative (convex) conic quadratic relaxations that exploit not only the $\ell_0$-"norm" terms, but also the fitness and smoothness functions. The iterative convexification approach substantially closes the gap between the $\ell_0$-"norm" and its $\ell_1$ surrogate. These stronger relaxations lead to significantly better estimators than $\ell_1$-norm approaches and also allow one to utilize affine sparsity priors. In addition, the parameters of the model and the resulting estimators are easily interpretable. Experiments with a tailored Lagrangian decomposition method indicate that the proposed iterative convex relaxations yield solutions within 1% of the exact $\ell_0$ approach, and can tackle instances with up to 100,000 variables in under one minute.
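To make the setup concrete, the following is a minimal sketch of the two formulations the abstract contrasts; the specific notation ($y$, $x$, $\lambda$, $k$) and the first-difference smoothness term are illustrative choices, not taken verbatim from the paper.

\min_{x \in \mathbb{R}^n} \;\; \underbrace{\|y - x\|_2^2}_{\text{fitness}} \; + \; \lambda \underbrace{\sum_{i=1}^{n-1} (x_{i+1} - x_i)^2}_{\text{smoothness}} \quad \text{s.t.} \quad \|x\|_0 \le k,

where $y \in \mathbb{R}^n$ is the observed signal and $\|x\|_0$ counts the nonzero entries of the estimate $x$. The standard $\ell_1$ surrogate replaces the non-convex constraint $\|x\|_0 \le k$ with $\|x\|_1 \le k'$ (or adds a penalty $\mu\|x\|_1$ to the objective). The relaxations proposed in the paper are stronger than this surrogate because they convexify the $\ell_0$ terms jointly with the quadratic fitness and smoothness terms, rather than relaxing the $\ell_0$ terms in isolation.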
