Abstract

This paper is concerned with structured simultaneous low-rank and sparse recovery, which can be formulated as a rank and zero-norm regularized least squares problem with the hard constraint \(\mathrm{diag}(\Theta )=0\). For this class of NP-hard problems, we propose a convex relaxation algorithm that applies the accelerated proximal gradient method to a convex relaxation model, obtained by combining a smoothed nuclear norm and a weighted \(l_1\)-norm in a regularized least squares problem. A theoretical guarantee is provided by establishing error bounds between the iterates and the true solution under mild restricted strong convexity conditions. To the best of our knowledge, this work is the first to characterize the error bound between the iterates of such an algorithm and the true solution. Finally, numerical results are reported for random test problems and synthetic data in subspace clustering to verify the efficiency of the proposed convex relaxation algorithm.
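The convex relaxation described above is built from two standard penalties whose proximal maps have closed forms: the nuclear norm (singular value thresholding) and the weighted \(l_1\)-norm (elementwise soft-thresholding), together with a projection enforcing \(\mathrm{diag}(\Theta )=0\). The sketch below illustrates these three building blocks; the function names are illustrative, and this is not the paper's implementation (in particular, it omits the smoothing of the nuclear norm and the acceleration scheme used in the paper).

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal map of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def weighted_soft_threshold(M, tau, W):
    """Proximal map of tau * ||W .* M||_1 (elementwise weighted soft-thresholding)."""
    return np.sign(M) * np.maximum(np.abs(M) - tau * W, 0.0)

def project_zero_diag(M):
    """Enforce the hard constraint diag(Theta) = 0."""
    out = M.copy()
    np.fill_diagonal(out, 0.0)
    return out
```

In an accelerated proximal gradient scheme, each iteration would take a gradient step on the smooth least squares term at the extrapolated point and then apply these maps to the result; the joint proximal map of the combined penalty under the diagonal constraint is what the paper's relaxation model is designed to make tractable.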
