Abstract

This paper deals with the preconditioning of truncated Newton methods for the solution of large-scale nonlinear unconstrained optimization problems. We focus on preconditioners that can be naturally embedded in the framework of truncated Newton methods, i.e., that can be built without storing the Hessian matrix of the function to be minimized, using only information on the Hessian obtained through products of the Hessian matrix with vectors. In particular, we propose a diagonal preconditioner that enjoys this feature and enables us to examine the effect of diagonal scaling on truncated Newton methods. This new preconditioner implements a scaling strategy based on the concept of equilibration of the data in linear systems of equations. Extensive numerical testing shows that the proposed diagonal preconditioning strategy is very effective: on most of the problems considered, the resulting diagonally preconditioned truncated Newton method performs better than both the unpreconditioned method and the method using the automatic preconditioner based on limited memory quasi-Newton updating (PREQN) recently proposed by Morales and Nocedal [Morales, J.L. and Nocedal, J., 2000, Automatic preconditioning by limited memory quasi-Newton updating. SIAM Journal on Optimization, 10, 1079–1096].
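To make the setting concrete, the sketch below illustrates how a diagonal scaling preconditioner could be built from Hessian-vector products alone and then used inside the truncated (preconditioned conjugate gradient) inner iteration of a Newton-type method. This is a minimal illustration under stated assumptions, not the authors' construction: the diagonal estimator shown (random ±1 probes) and the finite-difference Hessian-vector oracle are generic stand-ins, whereas the paper derives its diagonal from an equilibration argument.

```python
import numpy as np

def hessian_vector_product(grad, x, v, eps=1e-6):
    """Finite-difference approximation of H(x) @ v using only gradient calls.
    Stand-in for whatever Hessian-vector oracle the optimizer provides."""
    return (grad(x + eps * v) - grad(x)) / eps

def estimate_diagonal(hess_vec, n, num_probes=10, rng=None):
    """Estimate diag(H) from Hessian-vector products with random +/-1 probes.
    Illustrative only: the paper's equilibration-based diagonal is built
    differently, but it likewise uses only H*v information, no stored Hessian."""
    rng = np.random.default_rng() if rng is None else rng
    d = np.zeros(n)
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        d += z * hess_vec(z)
    d /= num_probes
    return np.maximum(np.abs(d), 1e-8)   # keep the scaling strictly positive

def truncated_newton_step(grad_x, hess_vec, diag, max_cg_iters=50, tol=0.5):
    """Preconditioned CG on H p = -g, truncated early (residual test or
    negative curvature). M = diag(diag) plays the role of the diagonal scaling."""
    n = grad_x.size
    p = np.zeros(n)
    r = -grad_x.copy()            # residual of H p = -g at p = 0
    z = r / diag                  # apply M^{-1}
    d = z.copy()
    rz = r @ z
    for _ in range(max_cg_iters):
        Hd = hess_vec(d)
        dHd = d @ Hd
        if dHd <= 0:              # negative curvature: truncate
            break
        alpha = rz / dHd
        p += alpha * d
        r -= alpha * Hd
        if np.linalg.norm(r) <= tol * np.linalg.norm(grad_x):
            break                 # inexact (truncated) solve is good enough
        z = r / diag
        rz_new = r @ z
        d = z + (rz_new / rz) * d
        rz = rz_new
    return p if p.any() else -grad_x / diag   # fall back to a scaled gradient step
```

The truncation rule above (relative residual test plus a negative-curvature exit) is the standard one in truncated Newton methods; the point of the sketch is that nothing beyond the Hessian-vector oracle and the vector `diag` is required, which is what makes diagonal preconditioners natural in this matrix-free setting.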
