Abstract

This paper discusses recent developments in superlinear convergence results for nonsmooth optimization and nonsmooth equations. The concept of semismoothness has been shown to play a key role in obtaining superlinear convergence for algorithms that solve nonsmooth problems. Under semismoothness and nonsingularity conditions, generalized Newton methods have been proved to be locally and superlinearly convergent for solving systems of nonsmooth equations. Moreover, generalized Newton methods based on nonsmooth equations have been used successfully for solving nonlinear complementarity and related problems. Various quasi-Newton methods have been proposed for different classes of nonsmooth equations, although more work on quasi-Newton methods is still needed. The idea of generalized Newton methods has further been applied to nonsmooth unconstrained and constrained optimization, for instance, LC1 problems and nonsmooth convex optimization.
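
To make the generalized Newton idea concrete, the sketch below applies the iteration x^{k+1} = x^k - V_k^{-1} Phi(x^k), with V_k an element of the generalized Jacobian of Phi at x^k, to a nonlinear complementarity problem reformulated through the Fischer-Burmeister function. Under semismoothness of Phi at a solution and nonsingularity of all elements of the generalized Jacobian there, this iteration converges locally superlinearly. The Fischer-Burmeister reformulation, the toy linear complementarity data, and all function and variable names are illustrative choices for this sketch, not taken from the paper.

```python
# Minimal sketch: a semismooth (generalized) Newton method applied to the
# Fischer-Burmeister reformulation of an NCP:
#   find x >= 0, F(x) >= 0, x' F(x) = 0,
# recast as the nonsmooth system Phi(x) = 0 with
#   Phi_i(x) = sqrt(x_i^2 + F_i(x)^2) - x_i - F_i(x).
# Names and problem data are illustrative, not from the paper.

import numpy as np

def semismooth_newton(F, JF, x0, tol=1e-10, max_iter=50):
    """Generalized Newton iteration x+ = x - V^{-1} Phi(x), V in dPhi(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        r = np.sqrt(x**2 + Fx**2)
        Phi = r - x - Fx
        if np.linalg.norm(Phi) < tol:
            break
        # Pick one element of the generalized Jacobian of Phi:
        # on nondegenerate components use the Fischer-Burmeister gradient,
        # on degenerate components (x_i = F_i = 0) take the limit along (1, 1).
        degenerate = r < 1e-14
        safe_r = np.where(degenerate, 1.0, r)
        da = np.where(degenerate, 1.0 / np.sqrt(2.0) - 1.0, x / safe_r - 1.0)
        db = np.where(degenerate, 1.0 / np.sqrt(2.0) - 1.0, Fx / safe_r - 1.0)
        V = np.diag(da) + np.diag(db) @ JF(x)
        x = x - np.linalg.solve(V, Phi)
    return x

# Toy linear complementarity problem with F(x) = M x + q.
M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-3.0, -2.0])
sol = semismooth_newton(lambda x: M @ x + q, lambda x: M, np.zeros(2))
print(sol)  # approximately (7/11, 5/11), the solution of this small LCP
```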
