Abstract

A scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique in the frame of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when the Beale–Powell restart criterion holds. The parameter scaling the gradient is selected as the spectral gradient. Under very mild conditions, it is shown that, for strongly convex functions, the algorithm is globally convergent. Computational results for a set of 750 unconstrained optimization test problems show that this new scaled conjugate gradient algorithm substantially outperforms known conjugate gradient methods, including the spectral conjugate gradient method of Birgin and Martínez [Birgin, E. and Martínez, J.M., 2001, A spectral conjugate gradient method for unconstrained optimization. Applied Mathematics and Optimization, 43, 117–128], the conjugate gradient method of Polak and Ribière [Polak, E. and Ribière, G., 1969, Note sur la convergence de méthodes de directions conjuguées. Revue Française Informat. Recherche Opérationnelle, 16, 35–43], as well as the most recent conjugate gradient method with guaranteed descent of Hager and Zhang [Hager, W.W. and Zhang, H., 2005, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization, 16, 170–192; Hager, W.W. and Zhang, H., 2004, CG-DESCENT, a conjugate gradient method with guaranteed descent. ACM Transactions on Mathematical Software, 32, 113–137].
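To make the construction concrete, the following is a minimal Python sketch of the iteration the abstract describes: a conjugate gradient step whose search direction is the memoryless BFGS direction obtained by applying the BFGS inverse update to the scaled identity θI, with θ chosen as the spectral gradient scaling θ = sᵀs / (yᵀs), and with a Beale–Powell restart back to the scaled steepest descent direction. Everything beyond the abstract is an assumption of this sketch, not the authors' SCALCG implementation: the function names, the Armijo backtracking line search standing in for the Wolfe search used in the paper, and the numerical safeguards.

```python
import numpy as np

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Sketch of a scaled memoryless BFGS (spectral) conjugate gradient
    iteration with a Beale-Powell restart test.  The Armijo backtracking
    search below is a stand-in for the Wolfe search of the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        # Backtracking Armijo line search (assumed; the paper uses Wolfe).
        alpha, fx, slope = 1.0, f(x), g @ d
        for _ in range(60):
            if f(x + alpha * d) <= fx + 1e-4 * alpha * slope:
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        ys = y @ s
        if ys <= 1e-12:                     # safeguard against bad curvature
            d = -g_new
        else:
            theta = (s @ s) / ys            # spectral gradient scaling
            if abs(g_new @ g) >= 0.2 * (g_new @ g_new):
                # Beale-Powell restart: reset to scaled steepest descent,
                # mirroring the reset of the preconditioner in the abstract.
                d = -theta * g_new
            else:
                # d = -H g, where H is the BFGS inverse update of theta * I
                # (the scaled memoryless BFGS direction).
                gs, gy, yy = g_new @ s, g_new @ y, y @ y
                d = (-theta * g_new
                     + (theta * gs / ys) * y
                     + (theta * gy / ys - (1 + theta * yy / ys) * gs / ys) * s)
        x, g = x_new, g_new
    return x
```

As a smoke test, the sketch drives the two-dimensional Rosenbrock function toward its minimizer at (1, 1):

```python
rosen   = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                              200 * (x[1] - x[0]**2)])
print(scaled_cg(rosen, rosen_g, [-1.2, 1.0]))   # should approach [1., 1.]
```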
