Abstract

This work presents a double step size algorithm with the acceleration property for solving nonlinear unconstrained optimization problems. Using an inexact line search technique, together with an approximation of the Hessian by an adequate diagonal matrix, an efficient accelerated gradient descent method is developed. The proposed method is proven to be linearly convergent for uniformly convex functions and, under some specific conditions, also linearly convergent for strictly convex quadratic functions. Numerical tests and comparisons show that the constructed scheme outperforms some known iterative methods for unconstrained optimization with respect to all three measured criteria: number of iterations, CPU time, and number of function evaluations.
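To make the ingredients named above concrete, the sketch below shows one plausible reading of such a scheme: a gradient iteration x_{k+1} = x_k - (alpha_k + beta_k) * gamma_k^{-1} * grad f(x_k), where the two step sizes alpha_k and beta_k come from Armijo backtracking (an inexact line search) and the scalar gamma_k plays the role of the diagonal Hessian approximation gamma_k * I. The update form, the Taylor-ratio rule for gamma, the backtracking parameters, and the names `adss_sketch` and `backtracking` are all illustrative assumptions; the paper's exact rules may differ.

```python
import numpy as np

def backtracking(f, x, g, d, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink the step until sufficient decrease holds."""
    alpha = alpha0
    fx = f(x)
    slope = g.dot(d)  # directional derivative; negative for a descent direction d
    while alpha > 1e-12 and f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def adss_sketch(f, grad, x0, tol=1e-6, max_iter=1000):
    """Hypothetical accelerated double step size iteration:
    x_{k+1} = x_k - (alpha_k + beta_k) * gamma_k^{-1} * grad f(x_k),
    with gamma_k > 0 a scalar stand-in for the diagonal Hessian approximation."""
    x = np.asarray(x0, dtype=float)
    gamma = 1.0                         # initial Hessian approximation gamma_0 * I
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g / gamma                  # accelerated gradient direction
        alpha = backtracking(f, x, g, d)            # first step size
        x_mid = x + alpha * d
        g_mid = grad(x_mid)
        # second step size: continue along d if it still descends (hedged choice)
        beta = backtracking(f, x_mid, g_mid, d, alpha0=alpha) if g_mid.dot(d) < 0 else 0.0
        x_new = x + (alpha + beta) * d
        # refresh gamma from a second-order Taylor ratio (one plausible update)
        s = x_new - x
        curvature = 2.0 * (f(x_new) - f(x) - g.dot(s)) / s.dot(s)
        gamma = curvature if curvature > 0 else 1.0  # keep the scalar positive
        x = x_new
    return x
```

For example, on a strictly convex quadratic the iteration drives the gradient to zero:

```python
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
print(adss_sketch(f, grad, np.array([5.0, 5.0])))  # approaches (1, -2)
```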
