arXiv:2204.12073
One-pass additive-error subset selection for $\ell_p$ subspace approximation

26 April 2022
Amit Deshpande
Rameshwar Pratap
Abstract

We consider the problem of subset selection for $\ell_p$ subspace approximation: efficiently find a \emph{small} subset of data points such that solving the problem optimally on this subset gives a good approximation to solving it optimally on the original input. Previously known subset selection algorithms based on volume sampling and adaptive sampling \cite{DeshpandeV07}, for the general case of $p \in [1, \infty)$, require multiple passes over the data. In this paper, we give a one-pass subset selection algorithm with an additive approximation guarantee for $\ell_p$ subspace approximation, for any $p \in [1, \infty)$. Earlier subset selection algorithms that give a one-pass multiplicative $(1+\epsilon)$ approximation work only in special cases. Cohen \textit{et al.} \cite{CohenMM17} give a one-pass subset selection with a multiplicative $(1+\epsilon)$ approximation guarantee for the special case of $\ell_2$ subspace approximation. Mahabadi \textit{et al.} \cite{MahabadiRWZ20} give a one-pass \emph{noisy} subset selection with a $(1+\epsilon)$ approximation guarantee for $\ell_p$ subspace approximation when $p \in \{1, 2\}$. Our subset selection algorithm gives a weaker, additive approximation guarantee, but it works for any $p \in [1, \infty)$.
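To make the streaming setting concrete, the sketch below shows what "one-pass subset selection" means operationally: each data point is seen exactly once, and a subset of $k$ rows is retained with probability tied to their $\ell_p$ norms. This is \emph{not} the paper's algorithm (the function name `one_pass_subset` and the use of Efraimidis–Espiritu Santo-style weighted reservoir sampling are illustrative assumptions); it only demonstrates the single-pass, bounded-memory access pattern the paper works under.

```python
# Illustrative one-pass subset selection weighted by lp row norms.
# NOT the paper's algorithm: a generic weighted reservoir sketch
# (Efraimidis-Spirakis A-Res keys) shown only to make the
# streaming access pattern concrete.
import heapq
import random

def one_pass_subset(rows, k, p=2, seed=0):
    """Stream over `rows` once; keep k rows, favoring large lp norms."""
    rng = random.Random(seed)
    heap = []  # min-heap of (key, original_index, row)
    for i, x in enumerate(rows):
        w = sum(abs(v) ** p for v in x)  # row's lp norm raised to p
        if w == 0:
            continue  # zero rows carry no weight, never selected
        key = rng.random() ** (1.0 / w)  # A-Res key: larger w -> larger key
        if len(heap) < k:
            heapq.heappush(heap, (key, i, x))
        elif key > heap[0][0]:
            heapq.heapreplace(heap, (key, i, x))  # evict smallest key
    # Return selected rows in their original stream order.
    return [entry[2] for entry in sorted(heap, key=lambda t: t[1])]
```

Memory usage is $O(k)$ rows regardless of stream length, which is the defining constraint of the one-pass model the abstract refers to.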
