Abstract

Many methods for solving minimization problems are variants of Newton's method, which requires the specification of the Hessian matrix of second derivatives. Quasi-Newton methods are intended for the situation where the Hessian is expensive or difficult to calculate. Quasi-Newton methods use only first derivatives to build an approximate Hessian over a number of iterations, and this approximation is updated at each iteration by a matrix of low rank. In unconstrained minimization, the original quasi-Newton equation is $B_{k+1} s_k = y_k$, where $s_k = x_{k+1} - x_k$ is the step and $y_k = g_{k+1} - g_k$ is the difference of the gradients at the last two iterates. In this paper, we first propose a new quasi-Newton equation $B_{k+1} s_k = y_k^*$, in which $y_k^* = y_k + A_k s_k$ for some matrix $A_k$. We then give two choices of $A_k$ that carry some second-order information from the Hessian of the objective function. The three corresponding BFGS-type algorithms are proved to possess the global convergence property, and superlinear convergence is proved for one of them. Extensive numerical experiments show that the proposed algorithms are very encouraging.
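To make the modified quasi-Newton equation concrete, the following is a minimal sketch, not the paper's exact algorithm: a standard BFGS update of the Hessian approximation driven by a modified secant vector $y_k^* = y_k + A_k s_k$. The function name and the placeholder choice of $A_k$ (a scaled identity) are illustrative assumptions, not taken from the paper.

import numpy as np

def bfgs_update(B, s, y_star, eps=1e-10):
    """One BFGS update of the Hessian approximation B so that the
    new matrix satisfies the (modified) quasi-Newton equation
    B_new @ s = y_star.  The update is skipped when the curvature
    condition y_star^T s > 0 fails, to keep B positive definite."""
    Bs = B @ s
    sBs = s @ Bs
    ys = y_star @ s
    if ys <= eps or sBs <= eps:
        return B  # safeguard: keep the previous approximation
    return B - np.outer(Bs, Bs) / sBs + np.outer(y_star, y_star) / ys

# Illustrative use with a hypothetical A_k (an assumption for this sketch):
n = 3
B = np.eye(n)                      # initial Hessian approximation
s = np.array([0.1, -0.2, 0.05])    # step: x_{k+1} - x_k
y = np.array([0.12, -0.18, 0.06])  # gradient difference: g_{k+1} - g_k
A = 0.01 * np.eye(n)               # placeholder A_k, not one of the paper's choices
y_star = y + A @ s                 # modified secant vector y_k^*
B = bfgs_update(B, s, y_star)
assert np.allclose(B @ s, y_star)  # the new B satisfies B s_k = y_k^*

Taking A = 0 recovers the original quasi-Newton equation, so the sketch degenerates to ordinary BFGS in that case.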
