Abstract

Low-rankness is widely observed in real-world data, and many machine learning and data mining problems require recovering low-rank matrices. Robust principal component analysis (RPCA) addresses such problems by separating the data into a low-rank part and a sparse part. The convex approach to RPCA has been well studied due to its elegant theoretical properties, and many extensions have been developed. However, state-of-the-art algorithms for the convex approach and its extensions are usually computationally expensive because they require the singular value decomposition (SVD) of large matrices. In this paper, we propose a novel RPCA model based on matrix tri-factorization, which only requires SVDs of very small matrices. This reduces the complexity of RPCA to linear and makes it fully scalable. It also overcomes a drawback of state-of-the-art scalable approaches such as AltProj, which require precise knowledge of the true rank of the low-rank component. As a result, our method is about 4 times faster than AltProj. Our method can serve as a lightweight, scalable tool for RPCA when the precise value of the true rank is unavailable.
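To make the SVD bottleneck mentioned above concrete, the following is a minimal sketch of the *classical convex* RPCA approach (Principal Component Pursuit solved by an inexact augmented Lagrange multiplier scheme), not the tri-factorization method proposed in this paper. The function name `rpca_pcp` and the parameter defaults (`lam = 1/sqrt(max(m, n))`, `rho = 1.5`) are illustrative choices; note the full SVD inside the loop, which is exactly the cost that a tri-factorization of small factors avoids.

```python
import numpy as np

def rpca_pcp(M, lam=None, tol=1e-7, max_iter=500):
    """Convex RPCA (Principal Component Pursuit): M = L + S.

    Illustrative inexact-ALM sketch; the per-iteration full SVD is
    the O(mn*min(m,n)) bottleneck that scalable methods avoid.
    """
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))          # standard PCP weight
    norm_M = np.linalg.norm(M)
    mu = 1.25 / np.linalg.norm(M, 2)            # common initial penalty
    rho = 1.5                                   # penalty growth factor
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                        # dual variable
    for _ in range(max_iter):
        # Low-rank update: singular value thresholding (full SVD here)
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Sparse update: elementwise soft-thresholding
        T = M - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        # Dual ascent and penalty update
        R = M - L - S
        Y += mu * R
        mu = min(mu * rho, mu * 1e7)
        if np.linalg.norm(R) <= tol * norm_M:
            break
    return L, S
```

A quick usage example: corrupt a rank-5 matrix with a few large sparse entries, then recover the two parts with `L, S = rpca_pcp(M)`. Methods like AltProj and the tri-factorization approach replace the full SVD step with much cheaper factored updates.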

