Abstract

Large-scale optimization algorithms frequently require sparse Hessian matrices that are not readily available. Existing methods for approximating large sparse Hessian matrices have limitations. To overcome these, we propose a novel approach that reformulates the problem as the solution of a large linear least squares problem. The least squares problem is sparse but may include a number of rows that contain significantly more entries than the other rows; such rows are regarded as dense. We exploit recent work on solving such problems using either the normal equations or an augmented system to derive a robust approach for computing approximate sparse Hessian matrices. Example sparse Hessians from the CUTEst test problem collection for optimization illustrate the effectiveness and robustness of the new method.
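As a rough illustration of the reformulation described in the abstract, the sketch below recovers the nonzero entries of a sparse symmetric Hessian from sampled products H s ≈ y by assembling and solving a sparse linear least squares problem. The sparsity pattern, the sample directions, the helper `estimate_sparse_hessian`, and the use of SciPy's LSQR solver are illustrative assumptions, not the authors' implementation; the paper solves the least squares problem via the normal equations or an augmented system, with special treatment of dense rows.

```python
# Minimal sketch (illustrative only): estimate the nonzero entries of a sparse
# symmetric Hessian H from sampled products H s ~= y by linear least squares.
# The pattern, samples and the LSQR solver are assumptions for this example.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.linalg import lsqr

def estimate_sparse_hessian(pattern, S, Y):
    """pattern : list of (i, j) with i <= j, the assumed nonzero positions.
       S, Y    : n-by-m arrays; column k satisfies H @ S[:, k] ~= Y[:, k]."""
    n, m = S.shape
    p = len(pattern)
    rows, cols, vals = [], [], []
    # Row i of "H s_k = y_k" for sample k gives one least squares equation,
    # linear in the unknown entries of H.
    for k in range(m):
        for idx, (i, j) in enumerate(pattern):
            rows.append(k * n + i); cols.append(idx); vals.append(S[j, k])
            if i != j:  # symmetry: entry (i, j) also enters equation row j
                rows.append(k * n + j); cols.append(idx); vals.append(S[i, k])
    A = coo_matrix((vals, (rows, cols)), shape=(m * n, p)).tocsr()
    b = Y.reshape(-1, order="F")          # stack the sampled products y_k
    x = lsqr(A, b)[0]                     # least squares estimate of entries
    H = np.zeros((n, n))
    for idx, (i, j) in enumerate(pattern):
        H[i, j] = H[j, i] = x[idx]
    return H

# Tiny usage example with a known tridiagonal Hessian.
n = 5
H_true = np.diag(np.full(n, 4.0)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
pattern = [(i, i) for i in range(n)] + [(i, i + 1) for i in range(n - 1)]
rng = np.random.default_rng(0)
S = rng.standard_normal((n, 3))
Y = H_true @ S
print(np.allclose(estimate_sparse_hessian(pattern, S, Y), H_true))  # True
```

In a formulation of this kind, a Hessian row with many nonzeros yields comparatively dense rows in the least squares matrix, which is the situation the normal-equations/augmented-system techniques cited in the abstract are designed to handle robustly.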
