Abstract

In this paper we consider several algorithms for reducing storage when using a quasi-Newton method in a dogleg trust-region setting for minimizing functions of many variables. Secant methods require O(n²) locations to store an approximate Hessian and O(n²) operations per iteration when minimizing a function of n variables. This storage requirement becomes impractical when n is large. Our algorithms use a BFGS update and require kn storage and 4kn + O(k²) operations per iteration, but they may require more iterations than standard trust-region techniques. Typically k is between 10 and 100. Our dogleg trust-region strategies involve matrix products with both the approximate Hessian and its inverse. Our techniques for updating expressions for the Hessian and its inverse can also be used to improve the performance of line-search, limited-memory algorithms.
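The abstract does not spell out the authors' update formulas, but the 4kn + O(k²) per-iteration cost is consistent with applying a limited-memory BFGS inverse Hessian via the standard two-loop recursion over the k stored curvature pairs (sᵢ, yᵢ). The sketch below is our illustration of that recursion, not the paper's implementation; the function name and NumPy realization are assumptions.

```python
import numpy as np

def lbfgs_inverse_hessian_times(g, s_pairs, y_pairs, gamma):
    """Apply the limited-memory BFGS approximation of the inverse
    Hessian to a vector g, using the standard two-loop recursion.

    s_pairs[i] = x_{i+1} - x_i and y_pairs[i] = grad_{i+1} - grad_i
    are the k stored pairs (each must satisfy y·s > 0, the BFGS
    curvature condition); gamma scales the initial inverse Hessian
    H_0 = gamma * I.  Each loop costs about 2kn multiplications,
    roughly 4kn in total, matching the operation count quoted above.
    """
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_pairs, y_pairs)]
    q = g.copy()
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y, rho in reversed(list(zip(s_pairs, y_pairs, rhos))):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    r = gamma * q  # apply the initial inverse Hessian H_0
    # Second loop: traverse pairs from oldest to newest.
    for (s, y, rho), alpha in zip(zip(s_pairs, y_pairs, rhos),
                                 reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return r
```

A dogleg step additionally needs products with the Hessian approximation itself, for example gᵀBg when computing the Cauchy point, which is why the abstract emphasizes maintaining cheap products with both the approximate Hessian and its inverse.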
