Abstract
This paper considers employing extra updates in the BFGS method for unconstrained optimization. At each iteration the usual BFGS Hessian approximation is updated a number of times, depending on first-order derivative information, to obtain a new Hessian approximation. Two approaches are proposed. One retains the global and superlinear convergence properties that the BFGS method has on convex functions; the other retains the quadratic-termination property, without exact line searches, of the symmetric rank-one method. The new algorithms attempt to combine the best features of certain methods intended for either parallel computation or large-scale optimization. It is concluded that some of the new algorithms are competitive with the standard BFGS method.
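To fix ideas, here is a minimal sketch of the kind of scheme the abstract describes: the standard BFGS update applied several times per iteration, once for each available pair of step and gradient-difference vectors. The function names `bfgs_update` and `multi_update`, the list of extra `(s, y)` pairs, and the curvature tolerance are illustrative assumptions; the abstract does not specify how the extra pairs are generated or how many updates are performed, which is precisely the paper's contribution.

```python
import numpy as np

def bfgs_update(B, s, y):
    """One standard BFGS update of the Hessian approximation B,
    with step s = x_new - x_old and gradient change y = g_new - g_old:
    B <- B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def multi_update(B, pairs):
    """Apply a sequence of BFGS updates, one per (s, y) pair gathered
    from first-order derivative information at the current iteration.
    The selection rule for the pairs is hypothetical here."""
    for s, y in pairs:
        # Skip pairs with insufficient curvature so B stays positive definite.
        if y @ s > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            B = bfgs_update(B, s, y)
    return B
```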