Abstract
Conjugate gradient methods play an important role in many fields of application due to their simplicity, low memory requirements, and global convergence properties. In this paper, we propose an efficient three-term conjugate gradient method by utilizing the DFP update for the inverse Hessian approximation, which satisfies both the sufficient descent and the conjugacy conditions. The basic philosophy is that the DFP update is restarted with a multiple of the identity matrix in every iteration. An acceleration scheme is incorporated in the proposed method to enhance the reduction in function value. Numerical results from an implementation of the proposed method on some standard unconstrained optimization problems show that the proposed method is promising and exhibits superior numerical performance in comparison with other well-known conjugate gradient methods.
Highlights
In this paper, we are interested in solving nonlinear large-scale unconstrained optimization problems of the form $\min f(x)$, $x \in \mathbb{R}^n$, where $f : \mathbb{R}^n \to \mathbb{R}$ is an at least twice continuously differentiable function.
Andrei [ ] considers the development of a three-term conjugate gradient method from the BFGS updating scheme of the inverse Hessian approximation, restarted as the identity matrix at every iteration, from which the search direction $d_{k+1}$ is obtained.
We propose our three-term conjugate gradient method by incorporating the DFP updating scheme for the inverse Hessian approximation within the framework of a memoryless quasi-Newton method, where at each iteration the inverse Hessian approximation is restarted as a multiple of the identity matrix with a positive scaling parameter $\mu_k$:
$$H_{k+1} = \mu_k I + \frac{s_k s_k^T}{s_k^T y_k} - \mu_k \frac{y_k y_k^T}{y_k^T y_k},$$
and the search direction is given by $d_{k+1} = -H_{k+1} g_{k+1}$, i.e.
$$d_{k+1} = -\mu_k g_{k+1} - \frac{s_k^T g_{k+1}}{s_k^T y_k}\, s_k + \mu_k \frac{y_k^T g_{k+1}}{y_k^T y_k}\, y_k.$$
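As a minimal sketch (not the authors' implementation), the memoryless DFP direction $d_{k+1} = -H_{k+1} g_{k+1}$ can be computed with three vector operations, without ever forming $H_{k+1}$ explicitly; the function name and signature below are illustrative:

```python
import numpy as np

def dfp_memoryless_direction(g_new, s, y, mu):
    """Three-term search direction from a memoryless DFP update.

    The inverse Hessian approximation is restarted as mu*I each iteration,
    so H = mu*I + s s^T/(s^T y) - mu * y y^T/(y^T y), and expanding
    d = -H g_new gives the three terms below (assumes s^T y > 0, mu > 0).
    """
    sTy = s @ y
    yTy = y @ y
    return (-mu * g_new
            - (s @ g_new) / sTy * s
            + mu * (y @ g_new) / yTy * y)
```

Working with the three vector terms keeps the per-iteration cost and memory at $O(n)$, which is the point of the memoryless restart.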
Summary
We are interested in solving nonlinear large-scale unconstrained optimization problems of the form $\min f(x)$, $x \in \mathbb{R}^n$, where $f : \mathbb{R}^n \to \mathbb{R}$ is an at least twice continuously differentiable function. An attractive property of these methods is that at each iteration the search direction satisfies the descent condition, namely $g_k^T d_k = -c\,\|g_k\|^2$ for some constant $c > 0$. Andrei [ ] considers the development of a three-term conjugate gradient method from the BFGS updating scheme of the inverse Hessian approximation, restarted as the identity matrix at every iteration, from which the search direction $d_{k+1}$ is obtained.
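The descent condition above is easy to verify numerically. The following is a sketch (the helper name is illustrative) under the standard assumptions $s_k^T y_k > 0$ and $\mu_k > 0$, which make the memoryless DFP matrix positive definite and hence the direction $d_k = -H_k g_k$ a descent direction:

```python
import numpy as np

def satisfies_sufficient_descent(g, d, c=1e-4):
    """Check the sufficient descent condition g^T d <= -c * ||g||^2."""
    return bool(g @ d <= -c * (g @ g))

# Illustrative check with a memoryless DFP direction d = -H g, where
# H = mu*I + s s^T/(s^T y) - mu * y y^T/(y^T y) (positive definite
# whenever s^T y > 0 and mu > 0).
rng = np.random.default_rng(1)
n = 8
s = rng.standard_normal(n)
y = s + 0.05 * rng.standard_normal(n)   # keeps s^T y > 0
g = rng.standard_normal(n)
mu = 1.0
H = mu * np.eye(n) + np.outer(s, s) / (s @ y) - mu * np.outer(y, y) / (y @ y)
d = -H @ g
```

Here $g^T d = -g^T H g < 0$ by positive definiteness, so the check passes for any small $c$.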