Abstract

For large-scale unconstrained optimization problems and nonlinear equations, we propose a new three-term conjugate gradient algorithm under the Yuan–Wei–Lu line search technique. It combines the steepest descent method with the classical conjugate gradient algorithm, exploiting both function information and the features of the current iterate. The algorithm possesses the following properties: (i) the search direction has a sufficient descent property and a trust region property, and (ii) the algorithm converges globally. Numerical results show that the proposed algorithm is competitive with similar optimization algorithms.

Highlights

  • It is well known that optimization of small- and medium-scale smooth functions is relatively simple, since many algorithms are available for it, such as Newton, quasi-Newton, and bundle algorithms

  • The conjugate gradient algorithm is mostly applied to smooth optimization problems, and in this paper, we propose a modified LS conjugate gradient algorithm to solve large-scale nonlinear equations and smooth problems

  • It is well known that the PRP algorithm is efficient but has shortcomings, as it does not possess global convergence under the weak Wolfe–Powell (WWP) line search technique
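For reference, the weak Wolfe–Powell (WWP) line search mentioned in the last highlight accepts a step length $\alpha_k$ along a search direction $d_k$ when both of the following standard inequalities hold:

```latex
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{\mathsf T} d_k, \qquad
g(x_k + \alpha_k d_k)^{\mathsf T} d_k \ge \sigma g_k^{\mathsf T} d_k,
```

where $g_k = \nabla f(x_k)$ and $0 < \delta < \sigma < 1$. These are the standard WWP conditions; the YWL technique of Yuan, Wei, and Lu modifies them, and the exact modified formulas are given in reference [43].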


Summary

Introduction

It is well known that models of small- and medium-scale smooth functions are relatively easy to handle, since many optimization algorithms apply to them, such as Newton, quasi-Newton, and bundle algorithms. The PRP algorithm is efficient but has a shortcoming: it does not possess global convergence under the WWP line search technique. To address this problem, Yuan, Wei, and Lu [43] developed a creative modification (YWL) of the normal WWP line search technique and obtained many fruitful theoretical results. In light of the previous work by experts on the conjugate gradient algorithm, a sufficient descent property is necessary for global convergence, and we express a new conjugate gradient algorithm under the YWL line search technique.

Important characteristics
This section lists some important properties of Algorithm 2.1: sufficient descent, the trust region property, and global convergence.

Numerical results
We list the numerical results in terms of the algorithm characteristics NI, NFG, and CPU, where NI is the total number of iterations, NFG is the total number of objective function and gradient evaluations, and CPU is the computation time in seconds.
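Since the paper's exact YWL formulas are not reproduced here, the overall structure of a three-term conjugate gradient iteration can be illustrated with a generic sketch. The direction update below uses a known three-term PRP variant (whose theta term enforces the sufficient descent identity d_{k+1}^T g_{k+1} = -||g_{k+1}||^2), and an Armijo backtracking rule stands in for the YWL line search. All formula choices, names, and the toy objective are illustrative assumptions, not the paper's method.

```python
# Illustrative three-term conjugate gradient sketch (NOT the paper's exact
# YWL method): PRP-style beta, a theta term that enforces sufficient descent,
# and simple Armijo backtracking as a stand-in line search.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):
    """Return a*x + y for vectors stored as lists."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def f(x):
    # Toy strongly convex quadratic: f(x) = x0^2 + 0.5*x1^2 - x0,
    # with unique minimizer (0.5, 0).
    return x[0] ** 2 + 0.5 * x[1] ** 2 - x[0]

def grad(x):
    return [2.0 * x[0] - 1.0, x[1]]

def three_term_cg(x, max_iter=200, tol=1e-8):
    g = grad(x)
    d = [-gi for gi in g]                      # initial steepest descent step
    for _ in range(max_iter):
        gnorm2 = dot(g, g)
        if gnorm2 ** 0.5 < tol:
            break
        if dot(g, d) >= 0.0:                   # numerical safeguard: restart
            d = [-gi for gi in g]
        gd = dot(g, d)
        # Armijo backtracking line search (stand-in for YWL/WWP)
        alpha, rho, delta = 1.0, 0.5, 1e-4
        while f(axpy(alpha, d, x)) > f(x) + delta * alpha * gd:
            alpha *= rho
        x_new = axpy(alpha, d, x)
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        # Three-term PRP direction: d_new = -g_new + beta*d - theta*y, where
        # theta is chosen so that d_new^T g_new = -||g_new||^2 exactly.
        beta = dot(g_new, y) / gnorm2
        theta = dot(g_new, d) / gnorm2
        d = [-gn + beta * di - theta * yi
             for gn, di, yi in zip(g_new, d, y)]
        x, g = x_new, g_new
    return x

x_star = three_term_cg([3.0, -2.0])
```

On the toy quadratic above, the iteration drives the gradient norm below the tolerance and returns a point near the minimizer (0.5, 0); the restart safeguard and the descent-enforcing theta term mirror, in simplified form, the sufficient descent and trust region properties the paper establishes for its direction.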

Problems and test experiments
A quadratic function (QF2)
Conclusion
