Abstract

The block coordinate descent method is a classic optimization technique with a long history, and it has been applied extensively because of its simplicity, speed, and stability. We prove that it can be accelerated to an $O(\frac{1}{k^2})$ rate of convergence for unconstrained convex optimization problems under the relatively mild assumption that the objective is a convex function with block coordinate strong convexity.
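Below is a minimal sketch of what an accelerated block coordinate descent scheme can look like, applied to a convex quadratic that is blockwise strongly convex. The blockwise update with FISTA-style extrapolation is illustrative only; the function name, block partition, step sizes, and momentum rule are assumptions for exposition, not the paper's exact method.

```python
import numpy as np

# Illustrative accelerated block coordinate descent on
# f(x) = 0.5 * x^T A x - b^T x, with A blockwise strongly convex.
# The extrapolation scheme is a FISTA-style assumption, not
# necessarily the update rule analyzed in the paper.

def accelerated_bcd(A, b, block_size, iters=500):
    n = len(b)
    blocks = [list(range(i, min(i + block_size, n)))
              for i in range(0, n, block_size)]
    x = np.zeros(n)   # main iterate
    y = x.copy()      # extrapolated iterate
    t = 1.0           # momentum parameter
    for _ in range(iters):
        x_prev = x.copy()
        for idx in blocks:
            # Block gradient of f at the extrapolated point y.
            g = A[idx, :] @ y - b[idx]
            # Block Lipschitz constant from the diagonal block of A.
            L = np.linalg.eigvalsh(A[np.ix_(idx, idx)]).max()
            y[idx] = y[idx] - g / L
        x = y.copy()
        # FISTA-style momentum update on the full iterate.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        t = t_next
    return x

# Usage: minimize a random strongly convex quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((40, 40))
A = M.T @ M + 0.1 * np.eye(40)   # blockwise strongly convex
b = rng.standard_normal(40)
x_star = np.linalg.solve(A, b)
x_hat = accelerated_bcd(A, b, block_size=8)
print("error:", np.linalg.norm(x_hat - x_star))
```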
