Abstract

In recent years, matrix rank minimization problems have received significant attention in the machine learning, data mining, and computer vision communities. These problems are commonly addressed through a convex relaxation that minimizes the nuclear norm of the matrix instead of its rank; the relaxed problem must be solved iteratively, with a singular value decomposition (SVD) computed at each iteration. Consequently, algorithms for nuclear norm minimization suffer from the high computational cost of repeated SVDs. In this paper, we propose a Fast Tri-Factorization (FTF) method that approximates the nuclear norm minimization problem and mitigates the cost of performing SVDs. The proposed FTF method can reliably solve a wide range of low-rank matrix recovery and completion problems, such as robust principal component analysis (RPCA), low-rank representation (LRR), and low-rank matrix completion (MC). We present specific models for the RPCA, LRR, and MC problems, and develop two alternating direction method (ADM) based iterative algorithms for solving them. Experimental results on a variety of synthetic and real-world data sets validate the efficiency, robustness, and effectiveness of our FTF method compared with state-of-the-art nuclear norm minimization algorithms.
