Abstract

The cyclic (block) coordinate gradient descent method is an optimization method that has attracted much interest in applied mathematics, statistics, and engineering. Reasons for this include its simplicity, speed, and stability, as well as its competitive performance on separable nonsmooth convex minimization problems, in which the objective function is the sum of a smooth function and a separable (and possibly nonsmooth) convex function, such as the $\ell_1$-regularized linear least squares problem and the $\ell_1$-regularized logistic regression problem. However, very little is known about the worst-case iteration complexity of the method for solving the separable nonsmooth convex minimization problem. We prove that, when the smooth part of the objective has a Lipschitz continuous gradient, the method terminates in $O(1/\epsilon)$ iterations with an $\epsilon$-optimal solution; equivalently, its convergence rate is $O(1/k)$, where $k$ is the iteration counter. We also prove linear-rate convergence of the method in two cases: when the objective is strongly convex with a Lipschitz continuous gradient, and when the smooth part of the objective is a composition of a strongly convex function having a Lipschitz continuous gradient with a linear function, the convex part of the objective is polyhedral, and some level set of the convex part is bounded and contains the set of optimal solutions.
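
To make the problem class and the method concrete, the following is a minimal sketch of cyclic coordinate descent applied to the $\ell_1$-regularized linear least squares problem mentioned above. It is illustrative only and is not the paper's exact algorithm or stepsize rules; the function names and the choice of exact coordinate-wise minimization via soft-thresholding are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cyclic_cd_lasso(A, b, lam, n_sweeps=100):
    """Cyclic coordinate descent for (1/2)||Ax - b||^2 + lam * ||x||_1.

    Each sweep updates the coordinates one at a time: a gradient step on
    the smooth part with stepsize 1 / L_j, where L_j = ||A[:, j]||^2 is the
    coordinate-wise Lipschitz constant, followed by soft-thresholding.
    For this quadratic smooth part the step is an exact minimization in x_j.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_norms_sq = (A ** 2).sum(axis=0)
    residual = A @ x - b                      # maintained incrementally
    for _ in range(n_sweeps):
        for j in range(n):
            if col_norms_sq[j] == 0.0:
                continue
            # Partial gradient of the smooth part with respect to x_j
            g_j = A[:, j] @ residual
            # Coordinate-wise proximal gradient step
            x_j_new = soft_threshold(x[j] - g_j / col_norms_sq[j],
                                     lam / col_norms_sq[j])
            # Keep the residual consistent with the updated coordinate
            residual += A[:, j] * (x_j_new - x[j])
            x[j] = x_j_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    x_true = np.zeros(20)
    x_true[:3] = [1.5, -2.0, 0.7]
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    print(np.round(cyclic_cd_lasso(A, b, lam=0.1, n_sweeps=200), 3))
```

Under the Lipschitz-gradient assumption above, the abstract's result says the number of sweeps needed to reach an objective value within $\epsilon$ of optimal grows like $O(1/\epsilon)$, i.e., the suboptimality after $k$ iterations decays like $O(1/k)$.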
