Abstract

In this paper, we introduce a nonlinear scaled conjugate gradient method built on a descent condition and a conjugacy relation. The proposed algorithm employs a conjugacy parameter chosen so that the method generates conjugate directions, together with a scaling parameter applied to the gradient to improve convergence behavior. The derived method is globally convergent and, in addition, generates descent directions at every iteration. Its efficiency is established through numerical tests on a variety of high-dimensional nonlinear test functions; the results confirm the improved behavior of the derived algorithm and support the presented theory.
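To illustrate the general structure of a scaled conjugate gradient iteration of the kind described above, the sketch below implements a generic variant in Python. It is not the paper's method: the scaling parameter theta_k (here a spectral, Barzilai-Borwein-like choice), the conjugacy parameter beta_k (here a nonnegative Hestenes-Stiefel value), and the Armijo backtracking line search are all placeholder assumptions standing in for the paper's specific formulas, which are not given in the abstract.

```python
import numpy as np

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic scaled conjugate gradient sketch (not the paper's formulas).

    Placeholder choices:
      - theta_k: spectral (Barzilai-Borwein-like) scaling of the gradient
      - beta_k : Hestenes-Stiefel conjugacy parameter, clipped at zero
      - step   : Armijo backtracking line search
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (sufficient decrease only)
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Spectral gradient scaling (placeholder for the paper's theta_k)
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0
        # Hestenes-Stiefel conjugacy parameter (placeholder for the paper's beta_k)
        beta = max((g_new @ y) / (d @ y), 0.0) if abs(d @ y) > 1e-12 else 0.0
        d = -theta * g_new + beta * d
        # Safeguard: restart with a scaled steepest-descent step if d is not a descent direction
        if g_new @ d >= 0:
            d = -theta * g_new
        x, g = x_new, g_new
    return x, f(x), k

# Usage example on a standard high-dimensional test function (extended Rosenbrock, n = 100)
if __name__ == "__main__":
    def rosen(x):
        return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

    def rosen_grad(x):
        g = np.zeros_like(x)
        g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
        g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
        return g

    x_star, f_star, iters = scaled_cg(rosen, rosen_grad, np.full(100, -1.2))
    print(f"f* = {f_star:.3e} after {iters} iterations")
```

The restart safeguard mirrors the descent property emphasized in the abstract: whenever the combined direction fails to be a descent direction, the iteration falls back to the scaled negative gradient, which is always one.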
