Abstract
Randomized block coordinate descent type methods have been shown to be efficient for solving large-scale optimization problems. Linear convergence to the unique solution is typically established under the assumption that the objective function is strongly convex. In this paper we propose a randomized block coordinate descent algorithm for solving the matrix least squares problem min_{X∈R^{m×n}} ‖C − AXB‖_F with A∈R^{p×m}, B∈R^{n×q}, and C∈R^{p×q}. We prove that the proposed algorithm converges linearly to the unique minimum-norm least squares solution (i.e., A†CB†) without the strong convexity assumption; instead, we only require that B have full row rank. Numerical experiments are given to illustrate the theoretical results.
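To illustrate the flavor of such a method, the following is a minimal sketch (not the paper's exact algorithm) of randomized block coordinate descent for min_X ‖C − AXB‖_F: at each step a random row of X is chosen as the coordinate block and the objective is minimized exactly over that row. The function name, block choice, and iteration count are illustrative assumptions; the exact row update is well defined when B has full row rank, so that BB^T is invertible.

```python
import numpy as np

def rbcd_matrix_ls(A, B, C, iters=2000, seed=None):
    """Illustrative randomized block coordinate descent sketch for
    min_X ||C - A X B||_F, starting from X = 0.

    Each iteration picks a random row index i of X (a coordinate block)
    and minimizes the objective exactly over that row. The update uses
    (B B^T)^{-1}, which exists when B has full row rank.
    """
    rng = np.random.default_rng(seed)
    p, m = A.shape
    n, q = B.shape
    X = np.zeros((m, n))
    BBt_inv = np.linalg.inv(B @ B.T)  # requires B to have full row rank
    for _ in range(iters):
        i = rng.integers(m)
        a = A[:, i]                   # column of A paired with row i of X
        R = C - A @ X @ B             # current residual
        # Exact minimizer over row i: d = a^T R B^T (B B^T)^{-1} / ||a||^2,
        # obtained by setting the gradient in that row to zero.
        d = (a @ R @ B.T) @ BBt_inv / (a @ a)
        X[i, :] += d
    return X
```

Recomputing the full residual R each iteration keeps the sketch short; a practical implementation would update R incrementally after each row change.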