Abstract

Because of their strong convergence properties and practical performance, conjugate gradient (CG) methods are widely used for solving unconstrained optimisation problems, especially large-scale ones. Since the 1950s, many studies have proposed new CG methods or improvements to existing ones. In this paper, we present a condition that guarantees the global convergence of CG methods when they are applied under the exact line search. Based on this condition, we make a minor modification to the Polak–Ribière–Polyak (PRP) and Hestenes–Stiefel (HS) CG methods to obtain new modified methods. Furthermore, to support the theoretical proof of global convergence in practical computation, we conducted a numerical experiment comparing the proposed methods with other well-known CG methods. The new modified methods required the fewest iterations and the shortest time to solve the test problems, and they solved the highest percentage of the test problems successfully. Hence, we conclude that they can be used successfully for solving unconstrained optimisation problems.

KEYWORDS: Unconstrained optimisation problems; conjugate gradient methods; exact line search; global convergence
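To illustrate the classical methods the abstract refers to, the following is a minimal sketch of a CG iteration using the standard PRP formula, β_k = g_{k+1}ᵀ(g_{k+1} − g_k)/‖g_k‖², applied to a convex quadratic where the exact line search has a closed form. This is an illustration of the unmodified PRP method only, not the modified methods proposed in the paper; the function names and the quadratic test problem are choices made here for demonstration.

```python
import numpy as np

def cg_prp(A, b, x0, tol=1e-8, max_iter=200):
    """Conjugate gradient iteration with the Polak-Ribiere-Polyak (PRP) beta,
    minimising the quadratic f(x) = 0.5 x^T A x - b^T x (A symmetric positive
    definite), for which the exact line search step is alpha = -g^T d / (d^T A d).
    Illustrative sketch only, not the modified methods of the paper."""
    x = x0.astype(float)
    g = A @ x - b                 # gradient of the quadratic at x
    d = -g                        # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)        # exact line search step length
        x = x + alpha * d
        g_new = A @ x - b
        beta = g_new @ (g_new - g) / (g @ g)  # PRP formula for beta_k
        d = -g_new + beta * d                 # new conjugate direction
        g = g_new
    return x

# Usage: minimise a small SPD quadratic; the minimiser satisfies A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg_prp(A, b, np.zeros(2))
```

On a quadratic with exact line search this iteration reduces to linear CG, so it converges in at most n steps; the HS variant differs only in the denominator of β_k (d_kᵀ(g_{k+1} − g_k) instead of ‖g_k‖²).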
