Abstract
We introduce a hybrid gradient model for solving unconstrained optimization problems, built on a specific accelerated gradient iteration. By applying a three-term hybridization relation to a transformed accelerated double-step-size model, we develop an efficient hybrid accelerated scheme. The iterative step-size variable is determined by a backtracking line-search technique, in which we take an optimally calculated starting value for the proposed method. In the convergence analysis, we show that the proposed method is at least linearly convergent on the sets of uniformly convex functions and strictly convex quadratic functions. Numerical computations confirm a significant improvement over several relevant hybrid and accelerated gradient processes: with respect to the number of iterations, CPU time, and the number of objective-function evaluations, the proposed process outperforms the comparative schemes several times over.
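To illustrate the step-size computation mentioned above, the following is a minimal sketch of a standard Armijo-type backtracking line search. It is a generic textbook version, not the paper's exact rule: the starting trial step `t0`, the contraction factor `beta`, and the sufficient-decrease parameter `sigma` are assumed illustrative values, and the quadratic test function is likewise hypothetical.

```python
import numpy as np

def backtracking_step(f, grad, x, direction, t0=1.0, beta=0.5, sigma=1e-4):
    """Generic Armijo backtracking line search (illustrative sketch).

    Starting from the trial step t0, the step is contracted by beta until
    the sufficient-decrease (Armijo) condition holds:
        f(x + t*d) <= f(x) + sigma * t * grad(x)^T d
    """
    fx = f(x)
    g = grad(x)
    t = t0
    while f(x + t * direction) > fx + sigma * t * g.dot(direction):
        t *= beta
    return t

# Hypothetical example on a strictly convex quadratic f(x) = 0.5 * x^T A x,
# one of the function classes for which linear convergence is established.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x = np.array([1.0, 1.0])
d = -grad(x)  # steepest-descent direction
t = backtracking_step(f, grad, x, d)
```

For a steepest-descent direction the Armijo condition is always satisfiable for small enough `t`, so the loop terminates; the returned step strictly decreases the objective.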