Abstract

Randomized block coordinate descent-type methods have been demonstrated to be efficient for solving large-scale optimization problems. Linear convergence to the unique solution is established when the objective function is strongly convex. In this paper we propose a randomized block coordinate descent algorithm for solving the matrix least squares problem min_{X ∈ R^{m×n}} ‖C − AXB‖_F with A ∈ R^{p×m}, B ∈ R^{n×q}, and C ∈ R^{p×q}. We prove that the proposed algorithm converges linearly to the unique minimum norm least squares solution (i.e., A†CB†) without the strong convexity assumption; instead, we require only that B have full row rank. Numerical experiments illustrate the theoretical results.
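To make the setting concrete, the following is a minimal sketch of a randomized block coordinate descent iteration for min_X ‖C − AXB‖_F, not the paper's exact algorithm: the column-block sampling scheme, block size, step size, and function name `rbcd_matrix_ls` are illustrative assumptions. Starting from X = 0 keeps the columns of every iterate in range(Aᵀ), which is why the iterates can approach the minimum norm solution A†CB† when B has full row rank.

```python
import numpy as np

def rbcd_matrix_ls(A, B, C, block_size=2, iters=3000, seed=None):
    """Illustrative randomized block coordinate descent for
    min_X ||C - A X B||_F (not the paper's exact method).

    Starting from X = 0, each iteration samples a block of columns of X
    and takes a gradient step on that block only.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape[1], B.shape[0]
    X = np.zeros((m, n))
    # Conservative step size 1 / (||A||_2^2 * ||B||_2^2), a Lipschitz
    # bound for the gradient of f(X) = 0.5 * ||C - A X B||_F^2.
    eta = 1.0 / (np.linalg.norm(A, 2) ** 2 * np.linalg.norm(B, 2) ** 2)
    for _ in range(iters):
        J = rng.choice(n, size=min(block_size, n), replace=False)
        G = A.T @ (C - A @ X @ B) @ B.T   # negative gradient of f at X
        X[:, J] += eta * G[:, J]          # update only the sampled columns
    return X
```

On a small well-conditioned instance with B of full row rank, the output can be checked against the minimum norm solution computed directly as `np.linalg.pinv(A) @ C @ np.linalg.pinv(B)`.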

