Abstract

Nonlinear conjugate gradient (CG) methods are important for solving large-scale unconstrained optimization problems, providing an effective way to locate a minimum of the objective function. Many modifications of nonlinear CG methods have been proposed to improve numerical performance and to establish global convergence properties. One such study is the modified CG method proposed by Rivaie et al. (2015). In this paper, we modify their work so as to obtain both efficient numerical performance and global convergence. Because the strong Wolfe line search is widely used in practice, the proposed modified method is implemented with it. A numerical experiment is performed to demonstrate the performance of the modified method in practice.

KEYWORDS: Unconstrained optimization; conjugate gradient method; sufficient descent property; global convergence
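To make the general scheme concrete, the following is a minimal sketch of a nonlinear CG iteration. The abstract does not give the authors' modified beta formula, so the classic Fletcher–Reeves beta is used here as a stand-in, and the objective is taken to be a strictly convex quadratic, for which the exact step length along a descent direction satisfies the strong Wolfe conditions (so no separate line-search routine is needed in this sketch):

```python
import math

def cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with a conjugate gradient iteration using the Fletcher-Reeves beta.
    This is an illustrative sketch, not the modified method of the paper."""
    n = len(b)

    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

    def dot(u, v):
        return sum(ui * vi for ui, vi in zip(u, v))

    x = list(x0)
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient: A x - b
    d = [-gi for gi in g]                             # initial direction: steepest descent
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        Ad = matvec(A, d)
        # Exact line search along d; for a quadratic this step length
        # satisfies the strong Wolfe conditions.
        alpha = -dot(g, d) / dot(d, Ad)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi + alpha * adi for gi, adi in zip(g, Ad)]
        beta = dot(g_new, g_new) / dot(g, g)          # Fletcher-Reeves beta
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Usage: minimize with A = [[4, 1], [1, 3]], b = [1, 2];
# the minimizer solves A x = b, i.e. x = (1/11, 7/11).
x_star = cg_quadratic([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
```

Modified CG methods such as the one studied here typically differ only in the beta formula and in the line-search conditions imposed; the surrounding iteration structure is as above.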
