Abstract

In this paper, a modified BFGS algorithm is proposed for unconstrained optimization. The proposed algorithm has the following properties: (i) a nonmonotone line search technique is used to obtain the step size α_k, which improves the effectiveness of the algorithm; (ii) the algorithm possesses not only global convergence but also superlinear convergence for general convex functions; (iii) the algorithm produces better numerical results than the standard BFGS method.
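To make the role of the nonmonotone line search concrete, here is a minimal sketch in the style of the classical nonmonotone Armijo rule of Grippo, Lampariello, and Lucidi, where the trial point is compared against the maximum of the last few function values rather than f(x_k) alone. The paper's exact acceptance rule is not given in this excerpt, so the parameters (σ, ρ, the memory of stored values) are illustrative assumptions.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, sigma=1e-4, rho=0.5, max_backtracks=30):
    """Backtracking line search with a nonmonotone Armijo condition:
        f(x + alpha d) <= max(f_hist) + sigma * alpha * g^T d,
    where f_hist holds the most recent function values. Comparing
    against the max over recent iterates (rather than f(x_k) alone)
    allows occasional increases in f, which can help the iteration
    escape narrow curved valleys. A generic sketch, not the paper's
    exact rule."""
    f_ref = max(f_hist)        # reference value: worst of the stored iterates
    gtd = float(g @ d)         # directional derivative g_k^T d_k (< 0 for descent)
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + sigma * alpha * gtd:
            return alpha
        alpha *= rho           # shrink the step and retry
    return alpha
```

For example, on f(x) = x² from x = 2 along the steepest-descent direction, the unit step overshoots and one backtrack (α = 0.5) lands exactly at the minimizer.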

Highlights

  • Consider min{f(x) | x ∈ ℝⁿ}, where f : ℝⁿ → ℝ is continuously differentiable

  • Under inexact line search techniques, Dai [ ] constructed an example to show that the BFGS method fails

  • Its global convergence can be found in [ ], but the method fails for general convex functions


Summary

Introduction

Consider min{f(x) | x ∈ ℝⁿ}, where f : ℝⁿ → ℝ is continuously differentiable. Many similar problems can be transformed into the above optimization problem (see [ – ] etc.). Under inexact line search techniques, Dai [ ] constructed an example showing that the BFGS method can fail to converge. To obtain global convergence of a BFGS method without the convexity assumption, Li and Fukushima [ , ] proposed modified BFGS methods. Under the WWP line search, Wei et al. [ ] proposed a quasi-Newton method and established its superlinear convergence for uniformly convex functions.
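For reference, the standard BFGS update of the Hessian approximation B_k, around which these modifications are built, is B_{k+1} = B_k − (B_k s_k s_kᵀ B_k)/(s_kᵀ B_k s_k) + (y_k y_kᵀ)/(y_kᵀ s_k), with s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. The sketch below implements this update with a simple curvature safeguard in the spirit of (but not identical to) the cautious updates of Li and Fukushima; the threshold eps is an illustrative assumption.

```python
import numpy as np

def bfgs_update(B, s, y, eps=1e-8):
    """Standard BFGS update of the Hessian approximation B:
        B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s).
    Safeguard: skip the update when the curvature y^T s is not
    sufficiently positive relative to ||s||^2, which preserves
    positive definiteness of B (a simplified stand-in for the
    cautious rules proposed by Li and Fukushima)."""
    sty = float(y @ s)
    if sty <= eps * float(s @ s):      # curvature condition fails: keep B unchanged
        return B
    Bs = B @ s
    return (B
            - np.outer(Bs, Bs) / float(s @ Bs)   # remove old curvature along s
            + np.outer(y, y) / sty)              # add observed curvature y y^T / y^T s
```

By construction the updated matrix satisfies the secant condition B₊ s_k = y_k, so on a quadratic with Hessian A (where y_k = A s_k) the approximation reproduces the true curvature along s_k after a single update.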

