Abstract

The paper introduces a family of algorithms for the unconstrained minimization of an n-variable, twice continuously differentiable function f. Unlike classical methods, which improve a current solution by moving along a straight line, the new methods improve the solution by moving along a quadratic curve in R^n. The specific curve is determined by minimizing an appropriate model of f. The algorithms thus obtained (called Curved Search algorithms) all possess a global convergence property combined with a quadratic rate of convergence. They use the same information and require the same computational effort as Newton's method, which is in fact a member of this class. Versions of curved search methods with an inexact line search of the Goldstein type are studied as well, and these retain the above desirable convergence properties. We also discuss a version, called the β-method, which requires no line search at all. Computational experience reported in the paper points to the potential improvement that may be gained...
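To make the idea concrete, the sketch below shows one step of a curved search on a toy problem. The particular quadratic curve used here, x(t) = x + t²·d_Newton + t(1−t)·d_gradient (which follows the steepest-descent direction for small t and reaches the full Newton step at t = 1), and the Armijo-style acceptance test are illustrative assumptions, not the paper's exact construction; the function names are likewise hypothetical.

```python
import numpy as np

def curved_search_step(f, grad, hess, x, t=1.0, c=1e-4, shrink=0.5, max_iter=50):
    """One illustrative curved-search step: backtrack in t along the
    quadratic curve x(t) = x + t^2 * d_newton + t*(1 - t) * d_grad.
    This specific curve is an assumed example, not the paper's method."""
    g = grad(x)
    H = hess(x)
    d_newton = np.linalg.solve(H, -g)   # Newton direction
    d_grad = -g                          # steepest-descent direction
    fx = f(x)
    for _ in range(max_iter):
        x_new = x + t**2 * d_newton + t * (1 - t) * d_grad
        # Sufficient-decrease test; near t = 0 the curve is tangent
        # to the steepest-descent direction, so g . d_grad = -|g|^2.
        if f(x_new) <= fx + c * t * g.dot(d_grad):
            return x_new
        t *= shrink
    return x  # no acceptable step found

# Usage: minimize the convex quadratic f(x) = x^T A x / 2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
hess = lambda x: A
x = np.array([1.0, -1.0])
for _ in range(10):
    x = curved_search_step(f, grad, hess, x)
print(np.linalg.norm(x) < 1e-6)
```

On a convex quadratic the full curve endpoint (t = 1) coincides with the Newton step, so the sketch converges in one iteration here; the curve's payoff in the paper is on non-quadratic functions, where small-t behavior gives globalization while t near 1 preserves the quadratic rate.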
