Abstract

Many optimization algorithms generate, at each iteration, a pair $(x_k, H_k)$ consisting of an approximation $x_k$ to the solution and a Hessian approximation $H_k$ that carries local second-order information about the problem. Much is known about the convergence of $x_k$ to the solution, but relatively little is known about the behavior of the sequence of matrix approximations. This paper analyzes the sequence $\{H_k\}$ generated by the extended Broyden class of updating schemes, independently of the optimization setting in which the updates are used. Various conditions under which convergence is assured are derived, the structure of the limits is delineated, and rates of convergence are obtained. These results extend and clarify those already in the literature.
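As background for the updating schemes the abstract refers to, the following is a minimal sketch of a single Broyden-class update of a Hessian approximation, written with NumPy. The function name and variable names are illustrative, not from the paper; the formula is the standard one-parameter Broyden family, in which $\phi = 0$ yields the BFGS update and $\phi = 1$ the DFP update (the "extended" class additionally allows $\phi$ outside $[0, 1]$).

```python
import numpy as np

def broyden_class_update(H, s, y, phi=0.0):
    """One Broyden-class update of the Hessian approximation H.

    s   : step, x_{k+1} - x_k
    y   : gradient change, grad f(x_{k+1}) - grad f(x_k)
    phi : family parameter (0 = BFGS, 1 = DFP; other values interpolate
          or, in the extended class, extrapolate).
    """
    Hs = H @ s
    sHs = s @ Hs   # s^T H s
    ys = y @ s     # curvature y^T s; must be > 0 to preserve positive
                   # definiteness for phi in [0, 1]
    v = y / ys - Hs / sHs
    return (H
            - np.outer(Hs, Hs) / sHs
            + np.outer(y, y) / ys
            + phi * sHs * np.outer(v, v))
```

Every member of the family satisfies the secant condition $H_{k+1} s_k = y_k$, which is easy to verify numerically after a single update.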
