Abstract

A block version of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) variable metric update formula and its modifications are investigated. Although this formula satisfies the quasi-Newton conditions with all stored difference vectors, and although the resulting improvement of convergence is in a certain sense optimal for quadratic objective functions, for general functions it does not guarantee that the corresponding direction vectors are descent directions. To overcome this difficulty while still exploiting the advantageous properties of the block BFGS update, a block version of the limited-memory variable metric BNS method for large-scale unconstrained optimization is proposed. The global convergence of the algorithm is established for convex, sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
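The abstract itself gives no formulas. As background for the quasi-Newton condition it refers to, the following is a minimal sketch of the classical single-step BFGS update of the inverse Hessian approximation, written with NumPy; the block variant studied in the paper enforces this condition simultaneously for several stored difference vectors, which this sketch does not attempt to reproduce.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Classical BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient difference).
    The updated matrix satisfies the quasi-Newton condition H_new @ y = s.
    Requires the curvature condition y^T s > 0.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # H_new = V H V^T + rho s s^T  (standard inverse-Hessian BFGS formula)
    return V @ H @ V.T + rho * np.outer(s, s)

# Quadratic test problem: f(x) = 0.5 x^T A x, so the gradient is g(x) = A x.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
x0 = np.array([1.0, 0.0])
x1 = np.array([0.0, 1.0])
s = x1 - x0
y = A @ x1 - A @ x0

H = bfgs_update(np.eye(2), s, y)
print(np.allclose(H @ y, s))  # quasi-Newton condition holds
```

The descent-direction difficulty mentioned in the abstract arises because, for non-quadratic functions, a block update enforcing several such conditions at once need not keep the approximation positive definite, whereas the single-step update above preserves positive definiteness whenever `y @ s > 0`.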
