Abstract

Given the ultrahigh dimensionality and the complex structure of data containing both matrices and vectors, mixed matrix minimization has become crucial for the analysis of such data. Recently, nonconvex functions such as the smoothly clipped absolute deviation, the minimax concave penalty, the capped $\ell_1$-norm penalty and the $\ell_p$ quasi-norm with $0 < p < 1$ have shown remarkable advantages in variable selection because they overcome over-penalization. In this paper, we propose and study a novel nonconvex mixed matrix minimization, which combines low-rank and sparse regularizations with nonconvex functions. An augmented Lagrangian method (ALM) is proposed to solve the nonconvex mixed matrix minimization problem. The resulting subproblems either have closed-form solutions or can be solved by fast solvers, which makes the ALM particularly efficient. In theory, we prove that the sequence generated by the ALM converges to a stationary point when the penalty parameter is above a computable threshold. Extensive numerical experiments illustrate that our proposed nonconvex mixed matrix minimization model outperforms existing ones.
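As one illustration of how subproblems with such nonconvex penalties can admit closed-form solutions, the sketch below implements the well-known elementwise SCAD thresholding operator (the proximal mapping of the SCAD penalty). It is a minimal sketch under generic assumptions; the function name, parameter values, and example inputs are illustrative and do not reproduce the specific model or ALM subproblems of the paper.

```python
import numpy as np

def scad_threshold(z, lam, a=3.7):
    """Elementwise closed-form minimizer of 0.5*(x - z)**2 + SCAD_{lam,a}(|x|).

    Standard SCAD thresholding rule; `lam` is the regularization level and
    `a > 2` controls where the penalty flattens out. Hypothetical helper,
    not the paper's exact subproblem solver.
    """
    z = np.asarray(z, dtype=float)
    absz = np.abs(z)
    x = np.empty_like(z)

    # Region 1: |z| <= 2*lam -> soft-thresholding.
    m1 = absz <= 2 * lam
    x[m1] = np.sign(z[m1]) * np.maximum(absz[m1] - lam, 0.0)

    # Region 2: 2*lam < |z| <= a*lam -> reduced shrinkage.
    m2 = (absz > 2 * lam) & (absz <= a * lam)
    x[m2] = ((a - 1) * z[m2] - np.sign(z[m2]) * a * lam) / (a - 2)

    # Region 3: |z| > a*lam -> no shrinkage (penalty is constant there).
    m3 = absz > a * lam
    x[m3] = z[m3]
    return x

# Example: one proximal step of the kind that appears inside an ALM iteration.
if __name__ == "__main__":
    z = np.array([-3.0, -0.8, 0.1, 1.5, 4.0])
    print(scad_threshold(z, lam=0.5))
```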
