Abstract

We develop a framework employing scaling functions for the construction of multistep quasi-Newton methods for unconstrained optimization. These methods, which utilize values of the objective function, are constructed from interpolants of the m+1 most recent iterates and gradient evaluations, and possess a free parameter that introduces an additional degree of flexibility. This permits the interpolating functions to assimilate information, in the form of function values, which is readily available at each iteration. Motivated by previous experience [1] with the use of function values in multistep methods, we investigate incorporating this information into the construction of the Hessian approximation at each iteration, in an attempt to accelerate convergence. We concentrate on a specific example from the general family of methods, corresponding to a particular choice of the scaling function, and from it derive three new algorithms. The relative numerical performance of these methods is assessed, and the most successful of them is then compared with the standard BFGS method and with an earlier algorithm utilizing function values, also developed by the authors [1].
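To make the setting concrete, the sketch below shows the standard (single-step) BFGS inverse-Hessian update that such methods generalize, together with a hypothetical two-step secant-like pair formed as a linear combination of the two most recent steps and gradient differences. The combination weight `delta` stands in for the paper's free parameter; the specific interpolation-based vectors used by the multistep methods are not reproduced here, so `two_step_pair` is an illustrative assumption, not the authors' construction.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse Hessian approximation H.

    s is the step x_{k+1} - x_k and y the gradient change
    g_{k+1} - g_k; the updated matrix satisfies the secant
    condition H_new @ y == s exactly.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def two_step_pair(s_k, s_km1, y_k, y_km1, delta):
    """Hypothetical two-step (m = 2) secant-like pair.

    Combines the two most recent steps/gradient differences with a
    free parameter delta; feeding (r, w) to bfgs_update in place of
    (s, y) yields a multistep-style update. This particular linear
    combination is an illustration only.
    """
    r = s_k - delta * s_km1
    w = y_k - delta * y_km1
    return r, w
```

For example, on a quadratic objective with Hessian A one has y = A s for any step s, and a single call to `bfgs_update` already enforces `H_new @ y == s`, which is the property the multistep variants impose on their interpolation-based pairs instead.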
