arXiv:2103.10949
Refined Least Squares for Support Recovery

19 March 2021
Ofir Lindenbaum
Stefan Steinerberger
Abstract

We study the problem of exact support recovery based on noisy observations and present Refined Least Squares (RLS). Given a set of noisy measurements $\mathbf{y} = \mathbf{X}\boldsymbol{\theta}^* + \boldsymbol{\omega}$, where $\mathbf{X} \in \mathbb{R}^{N \times D}$ is a (known) Gaussian matrix and $\boldsymbol{\omega} \in \mathbb{R}^N$ is an (unknown) Gaussian noise vector, our goal is to recover the support of the (unknown) sparse vector $\boldsymbol{\theta}^* \in \{-1,0,1\}^D$. To recover the support of $\boldsymbol{\theta}^*$, we use an average of multiple least squares solutions, each computed based on a subset of the full set of equations. The support is estimated by identifying the most significant coefficients of the average least squares solution. We demonstrate that in a wide variety of settings our method outperforms state-of-the-art support recovery algorithms.
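The averaging procedure described in the abstract can be sketched as follows. This is a minimal illustration of the idea, not the paper's exact algorithm: the subset size, the number of subsets, and the function and parameter names are all assumptions made for the example.

```python
import numpy as np

def refined_least_squares(X, y, k, n_subsets=50, subset_size=None, seed=None):
    """Sketch of the RLS idea: solve least squares on random subsets of
    the equations (rows of X), average the solutions, and return the
    indices of the k largest-magnitude averaged coefficients as the
    estimated support. Subset size defaults to half the rows
    (a heuristic chosen for this sketch, not taken from the paper)."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    if subset_size is None:
        subset_size = N // 2
    avg = np.zeros(D)
    for _ in range(n_subsets):
        idx = rng.choice(N, size=subset_size, replace=False)
        theta_hat, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        avg += theta_hat
    avg /= n_subsets
    # The most significant coefficients of the averaged solution.
    support = np.sort(np.argsort(np.abs(avg))[-k:])
    return support
```

For example, with a Gaussian $\mathbf{X}$, a $\pm 1$-valued sparse $\boldsymbol{\theta}^*$, and small Gaussian noise, the top-$k$ averaged coefficients should coincide with the true support when $N$ is sufficiently large relative to $D$.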
