Abstract

In this paper, we present a modified regularized Newton method for minimizing a nonconvex function whose Hessian matrix may be singular. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the method has a global convergence property. Under the local error bound condition, which is weaker than nonsingularity, the method has cubic convergence.

Highlights

  • We consider the unconstrained optimization problem min f(x), x ∈ Rn (1), where f : Rn → R is twice continuously differentiable, with gradient ∇f and Hessian ∇2f denoted by g(x) and H(x) respectively

  • We present a modified regularized Newton method for minimizing a nonconvex function whose Hessian matrix may be singular


Summary

Introduction

We consider the unconstrained optimization problem min f(x), x ∈ Rn, where f : Rn → R is twice continuously differentiable, with gradient ∇f and Hessian ∇2f denoted by g(x) and H(x) respectively. If the Hessian is Lipschitz continuous and nonsingular at the solution, the Newton method has quadratic convergence. (Sun, 1999) proposed a regularized Newton method in which the trial step is the solution of the linear equations (Hk + μkI)d = −gk. (H. Li, 2004) chose μk = ∥gk∥2 and showed that the regularized Newton method has quadratic convergence under the local error bound condition, which is weaker than nonsingularity. Newton and inexact Newton methods can be extended to nonconvex minimization problems (Dong-hui Li, 2004). The main scheme of the modified regularized Newton method for unconstrained nonconvex optimization is as follows: at every iteration, it solves the linear equations (Hk + μkI)d = −gk (7).
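The core iteration described above can be sketched numerically. The following is a minimal illustration, not the paper's full algorithm: it uses the regularization parameter μk = ∥gk∥2 and always takes the full step d, whereas the modified method in the paper includes additional safeguards (e.g. a line search or trust-region test) for the nonconvex case. All function and variable names here are illustrative.

```python
import numpy as np

def regularized_newton(g, H, x0, tol=1e-8, max_iter=100):
    """Sketch of a regularized Newton iteration.

    At each iterate x_k, solve (H_k + mu_k * I) d = -g_k with
    mu_k = ||g_k||^2, then take the full step x_{k+1} = x_k + d.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        gk = g(x)
        if np.linalg.norm(gk) < tol:          # stop when the gradient is small
            break
        mu = np.linalg.norm(gk) ** 2          # regularization parameter mu_k = ||g_k||^2
        A = H(x) + mu * np.eye(x.size)        # H_k + mu_k * I stays invertible even if H_k is singular
        d = np.linalg.solve(A, -gk)           # trial step from (H_k + mu_k I) d = -g_k
        x = x + d
    return x

# Example: f(x) = x1^4 + x2^2, whose Hessian diag(12*x1^2, 2) is
# singular at the minimizer (0, 0) -- the setting the method targets.
grad = lambda x: np.array([4 * x[0] ** 3, 2 * x[1]])
hess = lambda x: np.diag([12 * x[0] ** 2, 2.0])
x_star = regularized_newton(grad, hess, [1.0, 1.0])
```

The regularization term μkI is what distinguishes this from the plain Newton step: it keeps the linear system solvable when H(x) is singular, and since μk = ∥gk∥2 vanishes as the iterates approach a stationary point, the step reduces to the Newton step near the solution.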

The Algorithm and Global Convergence
Concluding Remarks

