Abstract

In this thesis we study the problem of minimizing nonlinear functions of several variables, where the objective function is continuously differentiable on an open subset of R^n. We develop mathematical optimization methods for solving large scale problems, i.e., problems with many thousands, or even millions, of variables. The proposed method is based on a theoretical study of the properties of minimal and low memory Quasi-Newton updates. We establish theorems concerning the characteristic polynomial, the number of distinct eigenvalues, and the corresponding eigenvectors. We derive closed formulas for calculating these quantities, avoiding both the storage and the factorization of matrices. The new theoretical results are applied to the large scale trust region subproblem, for calculating nearly exact solutions, as well as to a curvilinear search that combines a Quasi-Newton direction with a negative curvature direction. The new method drastically reduces the spatial complexity of known nonlinear programming algorithms. As a result, the new algorithms have spatial complexity Θ(n), while maintaining good convergence properties. The numerical results show that the proposed algorithms are efficient, fast, and very effective for solving large scale problems.
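To illustrate how a Quasi-Newton direction can be computed in Θ(n) memory without ever storing or factorizing an n×n matrix, the sketch below implements the standard L-BFGS two-loop recursion. This is a generic low-memory Quasi-Newton scheme from the literature, shown only as an illustration of the memory-saving idea; it is not the specific minimal-memory update developed in the thesis, and the scaling choice for the initial Hessian approximation is one common convention among several.

```python
import numpy as np

def two_loop_recursion(grad, s_list, y_list):
    """Return the low-memory Quasi-Newton direction -H*grad.

    Only the m most recent curvature pairs (s_k, y_k) are used,
    so the memory cost is Theta(m*n) -- linear in n for fixed m --
    and no n x n matrix is ever formed or factorized.
    """
    q = grad.astype(float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: traverse the pairs from newest to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial Hessian approximation H0 = gamma * I; a common
    # scaling choice (assumption, not the thesis's formula).
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: traverse the pairs from oldest to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r  # Quasi-Newton descent direction
```

With no stored pairs the recursion reduces to steepest descent (the direction is simply the negative gradient), and each stored pair refines the implicit inverse-Hessian approximation at only O(n) extra memory.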
