Abstract

This paper presents a modification of the BFGS method for unconstrained minimization that avoids the computation of derivatives. The gradients are approximated by differences of function values, and these approximations are constructed in such a way that a complete convergence proof can be given. The presented algorithm is implementable; no exact line search is required. It is shown that, if the objective function is convex and certain standard conditions hold, the algorithm converges to a solution. If the Hessian matrix of the objective function is positive definite and satisfies a Lipschitz condition in a neighbourhood of the solution, then the rate of convergence is superlinear.
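The general scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses plain central differences with a fixed step and an Armijo backtracking (inexact) line search, whereas the paper chooses the difference approximations specifically so that the convergence proof goes through. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

def fd_grad(f, x, h=1e-6):
    """Approximate the gradient by central differences of function values
    (a simple stand-in for the paper's carefully controlled approximations)."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def bfgs_fd(f, x0, tol=1e-6, max_iter=500):
    """Derivative-free BFGS sketch: finite-difference gradients plus an
    inexact (Armijo backtracking) line search -- no exact line search."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)                       # inverse-Hessian approximation
    g = fd_grad(f, x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        t, c = 1.0, 1e-4                # Armijo backtracking parameters
        while f(x + t * p) > f(x) + c * t * (g @ p) and t > 1e-12:
            t *= 0.5
        x_new = x_new_candidate = x + t * p
        g_new = fd_grad(f, x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function without any analytic derivatives.
rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
x_star = bfgs_fd(rosen, np.array([-1.2, 1.0]))
```

For a convex objective with positive definite, Lipschitz-continuous Hessian near the solution, the paper's point is that even with such difference approximations the method retains superlinear convergence, provided the difference steps are controlled appropriately.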
