In this paper we propose a modified regularized Newton method for convex minimization problems whose Hessian matrices may be singular. The proposed method is proved to converge globally if the gradient and Hessian of the objective function are Lipschitz continuous. Under the local error bound condition, we first show that the method converges quadratically in terms of the distance dist(x_k, X) to the solution set X; this implies that the sequence {x_k} converges to a solution x* ∈ X and that ‖x_k − x*‖ is equivalent to dist(x_k, X). We then prove that the proposed method converges cubically under the same local error bound condition, which is weaker than nonsingularity.
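For context, a regularized Newton method of the kind studied here replaces the possibly singular Hessian by a regularized matrix in the Newton system. A generic step is sketched below; the specific regularization parameter μ_k used by the proposed method is not given in this abstract, so the choice shown is only a representative one from the regularized Newton literature:

\[
\left(\nabla^2 f(x_k) + \mu_k I\right) d_k = -\nabla f(x_k), \qquad x_{k+1} = x_k + d_k, \qquad \mu_k = c\,\|\nabla f(x_k)\|, \quad c > 0.
\]

Since f is convex, \(\nabla^2 f(x_k)\) is positive semidefinite, so the regularized matrix \(\nabla^2 f(x_k) + \mu_k I\) is positive definite whenever \(\nabla f(x_k) \neq 0\), and the step d_k is well defined even when the Hessian itself is singular.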