Abstract

The curvature-aided IAG (CIAG) algorithm is an efficient asynchronous optimization method that accelerates IAG using a delay compensation technique. However, existing step-size rules for CIAG are conservative and hard to implement, and the Hessian computation in CIAG is often computationally expensive. To alleviate these issues, we first provide an easy-to-implement and less conservative step-size rule for CIAG. Next, we propose a modified CIAG algorithm that reduces the computational complexity by approximating the Hessian with a constant matrix. Convergence results are derived for each algorithm on both convex and strongly convex problems, and numerical experiments on logistic regression demonstrate their effectiveness in practice.
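The abstract does not reproduce the update equations, so the following is only an illustrative sketch of the two ideas it describes: an IAG-style aggregated step in which each delayed component gradient is compensated by a curvature term, and a reduced-cost variant in which the per-component Hessians are replaced by a single constant matrix. The function names, the cyclic component schedule, the choice of test problem, and the 1/L step size are assumptions made for this demonstration, not the paper's step-size rule or pseudocode.

```python
import numpy as np

def ciag_step(x, grads, hessians, snaps, gamma):
    """One CIAG-style update: each stored component gradient, evaluated at a
    delayed iterate snaps[i], is compensated by the curvature term
    H_i(snaps[i]) @ (x - snaps[i]) before aggregation."""
    agg = sum(g + H @ (x - s) for g, H, s in zip(grads, hessians, snaps))
    return x - gamma * agg

def modified_ciag_step(x, grads, B, snaps, gamma):
    """Reduced-cost variant: the per-component Hessians are replaced by one
    fixed matrix B, so no Hessian evaluations are needed during the run."""
    agg = sum(g + B @ (x - s) for g, s in zip(grads, snaps))
    return x - gamma * agg

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n = 5, 8
    # Strongly convex test problem: f_i(x) = 0.5 x^T A_i x, minimizer x* = 0,
    # so grad f_i(x) = A_i x and Hess f_i(x) = A_i.
    As = []
    for _ in range(n):
        M = rng.standard_normal((d, d))
        As.append(np.eye(d) + M @ M.T)
    L = np.linalg.eigvalsh(sum(As)).max()   # Lipschitz constant of the full gradient
    gamma = 1.0 / L                          # illustrative step size, not the paper's rule
    x = rng.standard_normal(d)
    snaps = [x.copy() for _ in range(n)]
    grads = [A @ s for A, s in zip(As, snaps)]   # stored (possibly stale) gradients
    for k in range(300):
        i = k % n                                # cyclically refresh one component
        snaps[i], grads[i] = x.copy(), As[i] @ x
        x = ciag_step(x, grads, As, snaps, gamma)
    print("distance to minimizer after CIAG-style run:", np.linalg.norm(x))
```

In the constant-matrix variant, B would be chosen once up front (for example, a scaled identity or a Hessian evaluated at the initial point); the specific choice analyzed in the paper is not stated in the abstract.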
