Abstract

In this paper, we focus on the problem of low-rank matrix factorization (LRMF) in the presence of outliers and missing observations. This problem is of practical importance, as a variety of applications in signal processing, wireless communications, and machine learning fit the LRMF paradigm. We introduce a general LRMF formulation that minimizes an outlier-robust $\ell_{p}$-based fitting term subject to structure-promoting constraints, including nonnegativity and/or sparsity. To address this computationally intractable problem, we develop an algorithmic framework named block iteratively reweighted least-squares (BIRLS). Each BIRLS iteration minimizes a regularized, reweighted $\ell_{2}$-based problem with a closed-form solution in an interweaved block-update fashion. Moreover, we prove that the entire iterative sequence generated by BIRLS converges to a stationary point of the original problem. Numerical experiments demonstrate the superiority of the proposed algorithm.
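To make the idea of block iteratively reweighted least-squares concrete, the following is a minimal sketch (not the authors' exact BIRLS algorithm, which additionally handles the structure-promoting constraints): it alternates closed-form, ridge-regularized weighted least-squares updates of the two factor blocks, where the weights reweight the residuals so that the weighted $\ell_2$ loss approximates the $\ell_p$ fitting term on the observed entries. All names and parameters (p, lam, eps, n_iters) are assumptions of this sketch.

```python
import numpy as np

def birls_sketch(M, mask, rank, p=1.0, lam=1e-3, eps=1e-6, n_iters=50):
    """Approximate M ~ U @ V.T on the observed entries (mask == 1) by
    iteratively reweighted least squares for an l_p fitting term.
    Missing entries of M may hold any finite value (e.g., zero)."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    I = lam * np.eye(rank)  # ridge term keeps the subproblems well posed

    for _ in range(n_iters):
        # Reweighting: weight w_ij ~ |r_ij|^(p-2) (smoothed by eps), so the
        # weighted l_2 loss mimics the l_p loss; missing entries get weight 0.
        R = mask * (M - U @ V.T)
        W = mask * (R**2 + eps) ** ((p - 2.0) / 2.0)

        # Block update of U: each row solves a small regularized weighted
        # least-squares problem with a closed-form solution.
        for i in range(m):
            Wi = W[i]
            A = (V * Wi[:, None]).T @ V + I
            b = (V * Wi[:, None]).T @ M[i]
            U[i] = np.linalg.solve(A, b)

        # Interweaved block update of V, recomputing the weights with the
        # freshly updated U before solving for each row of V.
        R = mask * (M - U @ V.T)
        W = mask * (R**2 + eps) ** ((p - 2.0) / 2.0)
        for j in range(n):
            Wj = W[:, j]
            A = (U * Wj[:, None]).T @ U + I
            b = (U * Wj[:, None]).T @ M[:, j]
            V[j] = np.linalg.solve(A, b)
    return U, V
```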
