Abstract

$$l_{q}$$ ( $$0< q < 1$$ ) regularization is a dominant strategy for matrix completion problems. The main goal of nonconvex $$l_{q}$$ regularization based algorithms is to find a so-called low-rank solution. Unfortunately, most existing algorithms rely on a full singular value decomposition (SVD) at each iteration, and thus become inefficient for large-scale matrix completion problems. To alleviate this limitation, in this paper we propose an accelerated inexact algorithm to handle such problems. The key idea is to employ the closed-form q-thresholding ( $$q = 1/2, 2/3$$ ) operator to approximate the rank of a matrix. The power method and the special “sparse plus low-rank” structure of the matrix iterates are exploited to enable efficient SVD. In addition, we employ Nesterov’s accelerated gradient method and a continuation technique to further accelerate the convergence of our proposed algorithm. A convergence analysis shows that the sequence $$\{X_{t}\}$$ generated by our proposed algorithm is bounded and has at least one accumulation point. Extensive experiments have been conducted to study its recovery performance on synthetic data, image recovery and recommendation problems. All results demonstrate that our proposed algorithm achieves recovery performance comparable to state-of-the-art methods while being faster and more efficient.
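The paper's own operator and partial-SVD routine are not reproduced here. As an illustration only, the following minimal NumPy sketch applies a scalar half-thresholding ( $$q = 1/2$$ ) step to the leading singular values of a matrix, assuming the standard closed-form operator for $$\arg\min_{x} (x - z)^{2} + \lambda |x|^{1/2}$$; the function names, the parameter values, and the use of a plain truncated SVD in place of the power-method-based partial SVD described in the abstract are all hypothetical choices made for this sketch.

```python
import numpy as np

def half_threshold(z, lam):
    """Scalar l_{1/2} thresholding: argmin_x (x - z)^2 + lam * |x|^{1/2}.

    Uses the trigonometric roots of the stationarity cubic; the candidate
    stationary point is compared against x = 0 explicitly, so no threshold
    constant has to be hard-coded.
    """
    z_abs = np.abs(z)
    if z_abs == 0.0:
        return 0.0
    # Writing x = r^2 > 0, stationarity of (x - |z|)^2 + lam * sqrt(x)
    # gives the depressed cubic r^3 - |z| r + lam/4 = 0.
    arg = (lam / 8.0) * (z_abs / 3.0) ** (-1.5)
    if arg > 1.0:          # no positive stationary point: shrink to zero
        return 0.0
    phi = np.arccos(arg)
    x_star = (2.0 / 3.0) * z_abs * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    # Keep the stationary point only if it beats the origin.
    f_star = (x_star - z_abs) ** 2 + lam * np.sqrt(x_star)
    return np.sign(z) * x_star if f_star < z_abs ** 2 else 0.0

def svd_half_threshold(Z, lam, rank):
    """Rank-truncated SVD followed by half-thresholding of the singular values."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    U, s, Vt = U[:, :rank], s[:rank], Vt[:rank, :]
    s_thr = np.array([half_threshold(si, lam) for si in s])
    keep = s_thr > 0
    return (U[:, keep] * s_thr[keep]) @ Vt[keep, :]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 150))  # rank-10 signal
    X = svd_half_threshold(M + 0.1 * rng.standard_normal(M.shape), lam=5.0, rank=20)
    print(np.linalg.matrix_rank(X))  # small singular values induced by noise are pruned
```

In this toy usage, thresholding the singular values of a noisy rank-10 matrix keeps roughly the ten dominant components and zeroes out the rest, which is the low-rank-promoting effect the abstract attributes to the q-thresholding operator.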
