Abstract

In this paper we propose efficient new linesearch algorithms for solving large-scale unconstrained optimization problems which exploit any local nonconvexity of the objective function. Current algorithms in this class typically compute a pair of search directions at every iteration: a Newton-type direction, which ensures both global and fast asymptotic convergence, and a negative curvature direction, which enables the iterates to escape from regions of local nonconvexity. A new point is generated by searching along a line or a curve obtained by combining these two directions. However, almost all of these algorithms ignore the relative scaling of the two directions. We propose a new algorithm that accounts for this relative scaling: at each iteration, only the more promising of the two directions is selected, and a linesearch is performed along it. The appropriate direction is chosen by estimating the rate of decrease of the quadratic model of the objective function along both candidate directions. We prove global convergence to second-order critical points for the new algorithm and report some preliminary numerical results.
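The selection rule described above can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not the paper's actual criterion: given the gradient `g`, the Hessian `H`, a Newton-type direction, and a negative curvature direction, it compares the predicted decrease of the quadratic model m(p) = gᵀp + ½pᵀHp along each candidate and returns the more promising one. The normalization and step estimate used here (exact minimization along convex directions, a unit trial step along nonconvex ones) are assumptions for the sake of the example.

```python
import numpy as np

def choose_direction(g, H, s_newton, d_neg):
    """Pick the more promising of two candidate directions by comparing
    the predicted decrease of the quadratic model
        m(alpha * p) = alpha * g.T @ p + 0.5 * alpha**2 * p.T @ H @ p
    along each of them. Hypothetical sketch only; the paper's actual
    selection rule may use a different scaling or step estimate."""

    def model_decrease(p):
        gp = g @ p
        pHp = p @ H @ p
        if pHp > 0:
            # Model is convex along p: minimize exactly over alpha >= 0.
            alpha = max(-gp / pHp, 0.0)
        else:
            # Negative (or zero) curvature along p: the model is unbounded
            # below, so evaluate the decrease at a unit trial step instead.
            alpha = 1.0
        return -(alpha * gp + 0.5 * alpha**2 * pHp)

    # A negative curvature direction can be flipped so that it is
    # also a (non-ascent) direction for the gradient.
    if g @ d_neg > 0:
        d_neg = -d_neg

    if model_decrease(s_newton) >= model_decrease(d_neg):
        return s_newton
    return d_neg
```

For example, with H = diag(2, -1) and g = (1, 0), the negative curvature direction (0, 1) predicts a larger model decrease than the Newton-type step in the convex subspace, so it would be selected; when the negative curvature is very mild, the Newton-type direction wins instead.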
