Abstract
This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations remain positive definite. The algorithms are based on a two-parameter family of rank-two updating formulae used earlier, with line search, in self-scaling variable metric algorithms. It is proved that in the quadratic case the new algorithms converge at least weakly superlinearly. A special case of these algorithms was implemented and tested numerically on several test functions. In this implementation, cubic interpolation was performed whenever the first "shot" (with unit step size) did not decrease the objective function satisfactorily; this happened infrequently, except on very difficult functions. The numerical results indicate that the new algorithm is competitive with, and often superior to, previous methods.
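To make the iteration described above concrete, the following is a minimal sketch, not the authors' exact method, of a self-scaling quasi-Newton step that tries the unit step first and falls back to an interpolated step when the objective is not satisfactorily decreased. It assumes the scaled-BFGS member of the two-parameter (Oren-Luenberger) family; the tolerances, the safeguard (a quadratic interpolation standing in for the paper's cubic interpolation, since only two function values and one slope are available here), and the Rosenbrock test function are illustrative choices, not taken from the paper.

    import numpy as np

    def self_scaling_qn(f, grad, x0, tol=1e-8, max_iter=500):
        x = np.asarray(x0, dtype=float)
        H = np.eye(x.size)                 # inverse Hessian approximation
        g = grad(x)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            d = -H @ g                     # quasi-Newton search direction
            t = 1.0                        # first "shot": unit step size
            f0, slope = f(x), g @ d
            f1 = f(x + t * d)
            if f1 > f0:
                # Insufficient decrease: one interpolation step using
                # f(x), f'(x; d), and f(x + d) (a quadratic model; the
                # paper uses cubic interpolation), clamped from below.
                t = max(-slope / (2.0 * (f1 - f0 - slope)), 1e-4)
            s = t * d
            x_new = x + s
            g_new = grad(x_new)
            y = g_new - g
            sy = s @ y
            if sy > 1e-12:                 # curvature condition keeps H positive definite
                H *= sy / (y @ H @ y)      # self-scaling factor, then BFGS update
                Hy = H @ y
                H += np.outer(s, s) * (1.0 + (y @ Hy) / sy) / sy
                H -= (np.outer(Hy, s) + np.outer(s, Hy)) / sy
            x, g = x_new, g_new
        return x

    if __name__ == "__main__":
        # Rosenbrock's function, a standard difficult test problem.
        rosen = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
        rosen_grad = lambda x: np.array([
            -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0]**2),
        ])
        print(self_scaling_qn(rosen, rosen_grad, [-1.2, 1.0]))

Because the update is applied only when the curvature condition s'y > 0 holds, the inverse Hessian approximation stays positive definite, which is what allows the unit step to be attempted without a line search.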