Abstract

The low-rank matrix approximation problem with respect to the component-wise ℓ1-norm (ℓ1-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims to recover a low-rank matrix that was perturbed with sparse noise, with applications such as foreground-background video separation. Although ℓ1-LRA is strongly believed to be NP-hard, there is, to our knowledge, no formal proof of this fact. In this paper, we prove that ℓ1-LRA is NP-hard, already in the rank-one case, using a reduction from MAX CUT. Our derivations draw interesting connections between ℓ1-LRA and several other well-known problems, namely robust PCA, ℓ0-LRA, binary matrix factorization, a particular densest bipartite subgraph problem, the computation of the cut norm of {−1, +1} matrices, and the discrete basis problem, all of which we prove to be NP-hard.
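
For concreteness, ℓ1-LRA can be stated as the following optimization problem; this is a standard formulation consistent with the abstract, and the symbols M, U, V, and r are not defined in this excerpt but are introduced here for illustration:

    % l1-LRA: given M in R^{m x n} and a target rank r, find the factors
    % U, V minimizing the component-wise l1-norm of the residual.
    \min_{U \in \mathbb{R}^{m \times r},\; V \in \mathbb{R}^{r \times n}}
        \| M - UV \|_1 ,
    \qquad \text{where } \| A \|_1 = \sum_{i,j} |A_{ij}| .

The rank-one case shown NP-hard in the paper corresponds to r = 1, i.e., approximating M by a single outer product uvᵀ.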
