ResearchTrend.AI

On the Complexity of Robust PCA and ℓ1-norm Low-Rank Matrix Approximation

30 September 2015
Nicolas Gillis
S. Vavasis
Abstract

The low-rank matrix approximation problem with respect to the component-wise ℓ1-norm (ℓ1-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims at recovering a low-rank matrix that was perturbed with sparse noise, with applications for example in foreground-background video separation. Although ℓ1-LRA is strongly believed to be NP-hard, there is, to the best of our knowledge, no formal proof of this fact. In this paper, we prove that ℓ1-LRA is NP-hard, already in the rank-one case, using a reduction from MAX CUT. Our derivations draw interesting connections between ℓ1-LRA and several other well-known problems, namely, robust PCA, ℓ0-LRA, binary matrix factorization, a particular densest bipartite subgraph problem, the computation of the cut norm of {−1,+1} matrices, and the discrete basis problem, which we all prove to be NP-hard.
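To make the problem concrete: rank-one ℓ1-LRA asks for vectors u, v minimizing ‖M − uvᵀ‖₁, the sum of absolute entrywise errors. Although the paper shows this is NP-hard in general, a standard heuristic (not from this paper; a hedged sketch with hypothetical function names) alternates between u and v, using the fact that with v fixed, each optimal u_i is a weighted median of the ratios M_ij/v_j with weights |v_j|:

```python
import numpy as np

def weighted_median(values, weights):
    """Value at which cumulative weight first reaches half the total weight."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cw = np.cumsum(w)
    return v[np.searchsorted(cw, 0.5 * cw[-1])]

def l1_lra_rank1(M, n_iter=50):
    """Alternating weighted-median heuristic for min_{u,v} ||M - u v^T||_1.

    A local-search sketch only: ell_1-LRA is NP-hard, so no guarantee of
    reaching the global optimum. Assumes M has a nonzero row/column.
    """
    m, n = M.shape
    # Initialize v from the row of M with largest ell_1 norm (a common choice).
    v = M[np.argmax(np.abs(M).sum(axis=1))].astype(float)
    u = np.zeros(m)
    for _ in range(n_iter):
        # With v fixed, u_i minimizing sum_j |M_ij - u_i v_j| is the
        # weighted median of M_ij / v_j with weights |v_j| (nonzero v_j only).
        jmask = np.abs(v) > 1e-12
        for i in range(m):
            u[i] = weighted_median(M[i, jmask] / v[jmask], np.abs(v[jmask]))
        # Symmetric update for v with u fixed.
        imask = np.abs(u) > 1e-12
        for j in range(n):
            v[j] = weighted_median(M[imask, j] / u[imask], np.abs(u[imask]))
    return u, v
```

On a matrix that is exactly rank one, one alternation already recovers the factorization; on noisy input the iteration only converges to a local minimum of the (non-convex) objective, consistent with the hardness result above.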
