Abstract

The paper describes new conjugate gradient algorithms for large-scale nonconvex problems with box constraints. To speed up convergence, the algorithms employ scaling matrices that transform the space of the original variables into a space in which the Hessian matrices of the problem's functionals have more clustered eigenvalues. This is done by applying limited memory BFGS updating matrices. Once the scaling matrix is computed, the next few conjugate gradient iterations are performed in the transformed space. The box constraints are handled efficiently by projection. We also present a limited memory quasi-Newton method which is a special case of our general algorithm. The presented algorithms have strong global convergence properties; in particular, they identify the constraints active at a solution in a finite number of iterations. We believe that they are competitive with the L-BFGS-B method and present numerical results that support this claim.
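To make the abstract's ingredients concrete, here is a minimal Python sketch combining the two standard building blocks it mentions: a limited memory BFGS two-loop recursion as the scaling operator and projection onto the box. This is an illustration only, not the paper's conjugate gradient scheme; the helper names, line-search constants, and the simple projected Armijo rule are our assumptions.

```python
import numpy as np

def project(x, lo, hi):
    """Project x onto the box [lo, hi] componentwise."""
    return np.minimum(np.maximum(x, lo), hi)

def two_loop(grad, s_list, y_list):
    """Apply the limited memory BFGS inverse-Hessian approximation to grad
    via the standard two-loop recursion (this plays the role of scaling)."""
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    if s_list:  # standard initial scaling gamma = s^T y / y^T y
        s, y = s_list[-1], y_list[-1]
        q *= s.dot(y) / y.dot(y)
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * y.dot(q)
        q += (a - b) * s
    return q

def projected_lbfgs(f, grad, x0, lo, hi, m=5, max_iter=200, tol=1e-8):
    """Projected limited memory quasi-Newton iteration on a box.

    A sketch only: the paper runs conjugate gradient iterations in the
    space transformed by the scaling matrix, whereas here we simply take
    the scaled direction and project, to illustrate the ingredients.
    """
    x = project(x0, lo, hi)
    g = grad(x)
    s_list, y_list = [], []
    for _ in range(max_iter):
        d = -two_loop(g, s_list, y_list)
        # Backtracking line search along the projected path (Armijo-type).
        t, fx = 1.0, f(x)
        while True:
            x_new = project(x + t * d, lo, hi)
            if f(x_new) <= fx + 1e-4 * g.dot(x_new - x) or t < 1e-12:
                break
            t *= 0.5
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s.dot(y) > 1e-10:  # keep the pair only if curvature is positive
            s_list.append(s); y_list.append(y)
            if len(s_list) > m:
                s_list.pop(0); y_list.pop(0)
        x, g = x_new, g_new
        # Projected-gradient stationarity measure for the box problem.
        if np.linalg.norm(project(x - g, lo, hi) - x) < tol:
            break
    return x

if __name__ == "__main__":
    # Example: a convex quadratic whose unconstrained minimizer lies
    # outside the box, so some bounds are active at the solution.
    f = lambda x: 0.5 * x.dot(x) - x.sum()
    grad = lambda x: x - 1.0
    lo, hi = np.zeros(5), np.full(5, 0.4)
    print(projected_lbfgs(f, grad, np.zeros(5), lo, hi))  # -> all 0.4
```

In the sketch, the curvature test `s.dot(y) > 1e-10` is the usual safeguard that keeps the limited memory update positive definite even when the underlying problem is nonconvex; the stopping test measures projected-gradient stationarity, which is the natural criterion for the box-constrained problem.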
