The low-rank matrix approximation problem with respect to the component-wise $\ell_1$-norm ($\ell_1$-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims at recovering a low-rank matrix that was perturbed with sparse noise, with applications, for example, in foreground-background video separation. Although $\ell_1$-LRA is strongly believed to be NP-hard, there is, to the best of our knowledge, no formal proof of this fact. In this paper, we prove that $\ell_1$-LRA is NP-hard, already in the rank-one case, using a reduction from MAX CUT. Our derivations draw interesting connections between $\ell_1$-LRA and several other well-known problems, namely, robust PCA, $\ell_0$-LRA, binary matrix factorization, a particular densest bipartite subgraph problem, the computation of the cut norm of $\{-1,+1\}$ matrices, and the discrete basis problem, all of which we also prove to be NP-hard.
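For concreteness, the standard formulation of rank-$r$ $\ell_1$-LRA under the component-wise $\ell_1$-norm mentioned above can be written as follows (the symbols $M$, $U$, $V$ and the dimensions $m$, $n$, $r$ are illustrative notation, not taken from the paper):
\[
\min_{U \in \mathbb{R}^{m \times r},\; V \in \mathbb{R}^{r \times n}} \; \|M - UV\|_1 \;=\; \sum_{i=1}^{m} \sum_{j=1}^{n} \bigl| M_{ij} - (UV)_{ij} \bigr| .
\]
The rank-one case for which NP-hardness is established corresponds to $r = 1$, where $U$ and $V$ reduce to a column and a row vector, respectively.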