Abstract

A class of two-parameter scaled memoryless BFGS methods is developed for solving unconstrained optimization problems. The scaling parameters are determined so as to improve the condition number of the corresponding memoryless BFGS update. It is shown that for uniformly convex objective functions, the search directions of the method satisfy the sufficient descent condition, which ensures global convergence. To achieve convergence for general functions, a revised version of the method is developed based on the Li–Fukushima modified secant equation. To enhance the performance of the methods, a nonmonotone scheme for computing the initial value of the step length in the line search procedure is suggested. Numerical experiments are conducted on a set of unconstrained optimization test problems from the CUTEr collection; they demonstrate the efficiency of the proposed algorithms in the sense of the Dolan–Moré performance profile.
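As background for the abstract, a minimal sketch of the building block it refers to: the search direction of a scaled memoryless BFGS method, obtained by applying one BFGS update to a scaled identity matrix. The scalar `theta` below is a generic scaling parameter; the paper's specific two-parameter choice (tuned to the condition number of the update) is not reproduced here.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y, theta=1.0):
    """Direction d = -H g, where H is the BFGS update of theta*I:

        H = theta*I - theta*(s y^T + y s^T)/(y^T s)
            + (1/(y^T s) + theta*(y^T y)/(y^T s)^2) * s s^T

    g: current gradient; s: step difference x_{k+1} - x_k;
    y: gradient difference g_{k+1} - g_k (curvature y^T s > 0 assumed).
    The product H g is formed with vector operations only, so no
    n-by-n matrix is stored (the "memoryless" property).
    """
    rho = 1.0 / y.dot(s)
    sg, yg = s.dot(g), y.dot(g)
    Hg = (theta * g
          - theta * rho * (yg * s + sg * y)
          + (rho + theta * rho ** 2 * y.dot(y)) * sg * s)
    return -Hg
```

For `theta > 0` and `y.dot(s) > 0` the implicit matrix `H` is positive definite, so the returned direction satisfies `d.dot(g) < 0` (a descent direction), which is the property the abstract's sufficient descent analysis strengthens.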
