Abstract

Due to their simplicity and global convergence properties, conjugate gradient (CG) methods are widely used for solving unconstrained optimization problems, especially large-scale ones. To establish global convergence and to obtain better numerical performance in practice, much effort has been devoted to developing new CG methods or modifying well-known ones. In 2012, Rivaie et al. proposed a new CG method, called RMIL, which shows good numerical results and is globally convergent under the exact line search. However, in 2016, Dai pointed out a mistake in the proof of global convergence of RMIL and, to guarantee global convergence, suggested a modified version called RMIL+. In this paper, we present another modified version of RMIL that is globally convergent under the exact line search. Furthermore, to support the theoretical proof of global convergence in practical computation, we report a numerical experiment comparing the modified version with RMIL, RMIL+, and CG-DESCENT.
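For readers unfamiliar with the family of methods discussed here, the following is a minimal sketch of a CG iteration using an RMIL-type coefficient, beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2, taken from Rivaie et al. (2012). It is an illustration only, not the modified method proposed in this paper; the exact line search is approximated by a bounded numerical minimization, and the function and stopping rule are chosen for the example.

import numpy as np
from scipy.optimize import minimize_scalar

def cg_rmil(f, grad, x0, tol=1e-6, max_iter=1000):
    """CG iteration with an RMIL-type coefficient (illustrative sketch).

    beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2   (Rivaie et al., 2012)
    The step length alpha_k approximates the exact line search by
    numerically minimizing f along the current search direction.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # approximate exact line search: minimize f(x + alpha * d) over alpha >= 0
        alpha = minimize_scalar(lambda a: f(x + a * d),
                                bounds=(0.0, 1e3), method="bounded").x
        x_new = x + alpha * d
        g_new = grad(x_new)
        denom = d @ d
        # RMIL coefficient; fall back to 0 if the previous direction degenerates
        beta = (g_new @ (g_new - g)) / denom if denom > 0 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_rmil(f, grad, np.array([-1.2, 1.0])))

RMIL+ and the version proposed in this paper differ from the sketch above only in how beta_k is restricted or redefined to secure the descent property needed for the global convergence proof.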
