We consider the problem of subset selection for $\ell_{p}$ subspace approximation, that is, to efficiently find a \emph{small} subset of data points such that solving the problem optimally on this subset gives a good approximation to solving the problem optimally on the original input. Previously known subset selection algorithms based on volume sampling and adaptive sampling \cite{DeshpandeV07}, for the general case of $p \geq 1$, require multiple passes over the data. In this paper, we give a one-pass subset selection with an additive approximation guarantee for $\ell_{p}$ subspace approximation, for any $p \geq 1$. Earlier subset selection algorithms that give a one-pass multiplicative $(1+\epsilon)$ approximation work only in special cases. Cohen \textit{et al.} \cite{CohenMM17} give a one-pass subset selection with a multiplicative $(1+\epsilon)$ approximation guarantee for the special case of $\ell_{2}$ subspace approximation. Mahabadi \textit{et al.} \cite{MahabadiRWZ20} give a one-pass \emph{noisy} subset selection with a $(1+\epsilon)$ approximation guarantee for $\ell_{p}$ subspace approximation when $p \in \{1, 2\}$. Our subset selection algorithm gives a weaker, additive approximation guarantee, but it works for any $p \geq 1$.
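For concreteness, the $\ell_{p}$ subspace approximation problem can be stated as follows (a standard formulation; the notation here is illustrative rather than quoted from the paper): given points $a_{1}, \dots, a_{n} \in \mathbb{R}^{d}$ and a target dimension $k$, find a $k$-dimensional linear subspace $V \subseteq \mathbb{R}^{d}$ minimizing
\[
  \mathrm{err}_{p}(V) \;=\; \Bigl( \sum_{i=1}^{n} d(a_{i}, V)^{p} \Bigr)^{1/p},
\]
where $d(a_{i}, V)$ denotes the Euclidean distance from $a_{i}$ to $V$. For $p = 2$ this recovers the classical low-rank approximation problem solved by the truncated SVD; subset selection asks for a small subset $S$ of the input points such that the optimal subspace for $S$ is a provably good subspace for the entire input.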