Abstract

The Kaczmarz and Gauss--Seidel methods aim to solve an $m \times n$ linear system $X{\beta} = {y}$ by iteratively refining the solution estimate; the former uses random rows of $X$ to update ${\beta}$ given the corresponding equations, and the latter uses random columns of $X$ to update the corresponding coordinates of ${\beta}$. Recent work analyzed these algorithms in a parallel comparison for overcomplete and undercomplete systems, showing convergence to the ordinary least squares (OLS) solution and the minimum Euclidean norm solution, respectively. This paper considers the natural follow-up to the OLS problem---ridge or Tikhonov regularized regression. By viewing them as variants of randomized coordinate descent, we present variants of the randomized Kaczmarz (RK) and randomized Gauss--Seidel (RGS) methods for solving this system and derive their convergence rates. We prove that a recent proposal, which can be interpreted as randomizing over both rows and columns, is strictly suboptimal---instead, one should always randomize over columns (RGS) when $m > n$ and over rows (RK) when $m < n$.
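
As a concrete illustration of the row/column duality the abstract describes, below is a minimal NumPy sketch of the two base iterations on an unregularized system, with rows and columns sampled proportionally to their squared norms (the standard Strohmer--Vershynin and Leventhal--Lewis schemes). This is an assumed illustrative setup, not the paper's ridge variants: those modify these updates to target the regularized solution.

```python
# Sketch: one step each of randomized Kaczmarz (row updates) and
# randomized Gauss-Seidel (coordinate/column updates) for X beta = y.
# Illustrative only; the paper's ridge variants differ.
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 20                          # overcomplete: m > n
X = rng.standard_normal((m, n))
beta_true = rng.standard_normal(n)
y = X @ beta_true                       # consistent system for illustration

def rk_step(beta, X, y, rng):
    """RK: project the iterate onto the hyperplane of a random row."""
    p = np.sum(X**2, axis=1) / np.sum(X**2)   # row probs ~ squared norms
    i = rng.choice(X.shape[0], p=p)
    xi = X[i]
    return beta + (y[i] - xi @ beta) / (xi @ xi) * xi

def rgs_step(beta, X, y, rng):
    """RGS: exact minimization of the residual along a random coordinate."""
    p = np.sum(X**2, axis=0) / np.sum(X**2)   # column probs ~ squared norms
    j = rng.choice(X.shape[1], p=p)
    xj = X[:, j]
    beta = beta.copy()
    beta[j] += xj @ (y - X @ beta) / (xj @ xj)
    return beta

beta_rk, beta_rgs = np.zeros(n), np.zeros(n)
for _ in range(2000):
    beta_rk = rk_step(beta_rk, X, y, rng)
    beta_rgs = rgs_step(beta_rgs, X, y, rng)
print(np.linalg.norm(beta_rk - beta_true),
      np.linalg.norm(beta_rgs - beta_true))
```

Since $m > n$ in this toy setup, the abstract's prescription would favor the column-based (RGS) updates, which operate on the smaller $n$-dimensional coordinate space.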
