Abstract

We present an efficient one-parameter family of conjugate gradient methods for unconstrained optimization problems. These methods combine the Polak-Ribiere-Polyak (PRP) method and the Rivaie-Mustafa-Ismail-Leong (RMIL) method. We prove global convergence under the Wolfe line search for general nonlinear objective functions. Finally, we report numerical experiments demonstrating the efficiency of the proposed approach.
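To illustrate the general idea, the sketch below shows a nonlinear conjugate gradient loop whose coefficient is a convex combination of the standard PRP and RMIL formulas, with a parameter theta in [0, 1] playing the role of the family parameter. This is an assumption made for illustration, not the authors' exact formulation; the hybrid_cg function, the nonnegativity safeguard, and the choice theta = 0.5 are all hypothetical, and the Wolfe line search is delegated to scipy.optimize.line_search.

```python
# Minimal sketch of a hybrid PRP/RMIL conjugate gradient method.
# NOTE: the convex combination of beta_PRP and beta_RMIL is an assumption,
# not necessarily the exact one-parameter family proposed in the paper.
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=500):
    """Illustrative hybrid PRP/RMIL conjugate gradient iteration."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step length satisfying the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                   # line search failed: restart
            alpha, d = 1e-4, -g
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_prp = (g_new @ y) / (g @ g)    # Polak-Ribiere-Polyak coefficient
        beta_rmil = (g_new @ y) / (d @ d)   # Rivaie-Mustafa-Ismail-Leong coefficient
        # Hypothetical one-parameter combination, truncated at zero as a
        # common safeguard for PRP-type methods.
        beta = max(0.0, theta * beta_prp + (1.0 - theta) * beta_rmil)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Usage example: minimize the Rosenbrock function from a standard start.
    from scipy.optimize import rosen, rosen_der
    x_star = hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0]), theta=0.5)
    print(x_star)  # should approach the minimizer [1, 1]
```

Setting theta = 1 recovers a PRP-type iteration and theta = 0 an RMIL-type one, which is the usual motivation for such a one-parameter family: it interpolates between the two underlying methods while the Wolfe conditions control the step length.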
