Abstract

Robust principal component analysis (RPCA) is devoted to recovering low-rank structure from grossly corrupted, noisy datasets. However, the performance of RPCA is usually limited by the computational cost of the singular value decomposition (SVD), which rules out its potential application to many large-scale real-world problems. In this paper, we develop a nonconvex optimization algorithm customized to SVD-free RPCA models. The proposed algorithm, which is built upon the proximal alternating linearized minimization of Bolte et al. [Proximal alternating linearized minimization for nonconvex and nonsmooth problems. Math Program. 2014;146(1–2):459–494], reduces computational effort by partially linearizing the data-fidelity term and improves efficiency by leveraging inertial techniques. Under the Kurdyka-Łojasiewicz assumption on the objective function and mild conditions on the step sizes, the sequence produced by the proposed algorithm converges globally to a critical point of the SVD-free RPCA model. Numerical simulations on synthetic and real datasets demonstrate the compelling performance of the proposed algorithm.
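
To make the abstract's description more concrete, the sketch below illustrates one common way an inertial, PALM-style scheme can be applied to a factored (SVD-free) RPCA model of the form min_{U,V,S} 0.5‖M − UVᵀ − S‖²_F + (λ₁/2)(‖U‖²_F + ‖V‖²_F) + λ₂‖S‖₁. The factored objective, step-size rule, and parameter names here are illustrative assumptions, not the authors' exact formulation or code.

```python
# Illustrative inertial PALM-style updates for a factored (SVD-free) RPCA model.
# Assumed model (not necessarily the paper's exact one):
#   min_{U,V,S} 0.5*||M - U V^T - S||_F^2 + (lam1/2)*(||U||_F^2 + ||V||_F^2) + lam2*||S||_1
import numpy as np

def soft_threshold(X, tau):
    """Proximal operator of tau*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def inertial_palm_rpca(M, rank, lam1=1e-2, lam2=1e-1, beta=0.5, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    S = np.zeros_like(M)
    U_prev, V_prev, S_prev = U.copy(), V.copy(), S.copy()
    for _ in range(iters):
        # Inertial extrapolation: momentum step using the previous iterate.
        U_bar = U + beta * (U - U_prev)
        V_bar = V + beta * (V - V_prev)
        S_bar = S + beta * (S - S_prev)
        U_prev, V_prev, S_prev = U.copy(), V.copy(), S.copy()

        # U-step: gradient step on the (partially linearized) fidelity in U;
        # step size 1/L_U with L_U a local Lipschitz bound (spectral norm of an r x r matrix).
        R = U_bar @ V_bar.T + S_bar - M
        L_U = np.linalg.norm(V_bar.T @ V_bar, 2) + lam1
        U = U_bar - (R @ V_bar + lam1 * U_bar) / L_U

        # V-step: same idea, using the freshly updated U.
        R = U @ V_bar.T + S_bar - M
        L_V = np.linalg.norm(U.T @ U, 2) + lam1
        V = V_bar - (R.T @ U + lam1 * V_bar) / L_V

        # S-step: proximal (soft-thresholding) step for the l1 term;
        # the fidelity gradient in S has Lipschitz constant 1.
        R = U @ V.T + S_bar - M
        S = soft_threshold(S_bar - R, lam2)
    return U, V, S
```

Note that all full-matrix SVDs are avoided: the only spectral norms computed are of small rank-by-rank Gram matrices, which is what makes factored formulations attractive for large-scale data.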
