Abstract
This work presents a double step size algorithm with an acceleration property for solving nonlinear unconstrained optimization problems. Using an inexact line search technique, together with an approximation of the Hessian by an adequate diagonal matrix, an efficient accelerated gradient descent method is developed. The proposed method is proven to be linearly convergent for uniformly convex functions and, under some specific conditions, linearly convergent for strictly convex quadratic functions as well. Numerical tests and comparisons show that the constructed scheme outperforms some known iterations for unconstrained optimization with respect to all three tested properties: number of iterations, CPU time, and number of function evaluations.
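The abstract's exact double step size rule is not spelled out here, but the general shape of an accelerated gradient method it describes can be sketched as follows: an Armijo (inexact) backtracking line search supplies one step size, while a scalar surrogate for the Hessian diagonal supplies the acceleration factor. This is a minimal illustrative sketch under those assumptions, not the paper's actual algorithm; the names `accelerated_gd`, `gamma`, and the second-order update of `gamma` are hypothetical choices.

```python
import numpy as np

def accelerated_gd(f, grad, x0, beta=0.5, sigma=1e-4, tol=1e-6, max_iter=1000):
    """Gradient descent with Armijo backtracking and a scalar acceleration
    factor gamma approximating the Hessian diagonal.
    Illustrative sketch only -- not the paper's exact double-step-size rule."""
    x = np.asarray(x0, dtype=float)
    gamma = 1.0                      # scalar surrogate for the Hessian diagonal
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g / gamma               # accelerated descent direction
        t = 1.0
        # Armijo (inexact) backtracking line search for the step size t
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta
        x_new = x + t * d
        # Update gamma from a second-order Taylor estimate along the step:
        # f(x_new) ~ f(x) + g.s + (gamma/2)||s||^2  =>  solve for gamma
        s = x_new - x
        denom = s.dot(s)
        num = 2.0 * (f(x_new) - f(x) - g.dot(s))
        gamma = num / denom if (denom > 0 and num > 0) else 1.0
        x = x_new
    return x
```

For a strictly convex quadratic such as f(x) = ||x||^2, the Taylor-based update recovers the true curvature (gamma = 2), so the accelerated step becomes an exact Newton-like step and the iteration terminates very quickly.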