Abstract

The preconditioned generalized shift-splitting (PGSS) iteration method is unconditionally convergent for solving saddle-point problems with nonsymmetric coefficient matrices. Using the PGSS iteration as the inner solver for the Newton method, we establish a class of Newton-PGSS methods for solving large sparse nonlinear systems with nonsymmetric Jacobian matrices arising from saddle-point problems. For the newly presented method, we give local and semilocal convergence analyses under the Hölder condition, which is weaker than the Lipschitz condition. To further raise the efficiency of the algorithm, we improve the method to obtain the modified Newton-PGSS method and prove its local convergence. Furthermore, we compare our new methods with the Newton-RHSS method, a competitive method for solving large sparse nonlinear systems with nonsymmetric saddle-point Jacobian matrices, and the numerical results show the efficiency of our new method.

Highlights

  • In this paper, we will explore effective and convenient methods for solving the nonlinear nonsymmetric saddle-point problem F(x) = 0, (1) where F: D ⊂ R^(n+m) ⟶ R^(n+m) is a continuously differentiable nonlinear function with F = (F_1, F_2, ..., F_(n+m))^T and x = (x_1, x_2, ..., x_(n+m))^T, defined on an open convex subset D of the (n + m)-dimensional real linear space R^(n+m)

  • Under Assumption 1, we establish the local convergence theorem for the Newton-preconditioned generalized shift-splitting (PGSS) method; from it we learn the properties of the function F around the numerical solution x∗ and obtain information about the radius of the convergence neighborhood. The properties and information mentioned above determine the local convergence of the given method

  • RHSS and PGSS are used as preconditioners, and the Krylov subspace method is applied to solve the resulting systems; the preconditioned solvers outperform the unpreconditioned Krylov subspace method in both CPU time and iteration count

Summary

Introduction

We will explore effective and convenient methods for solving the nonlinear nonsymmetric saddle-point problem F(x) = 0. The researchers present the modified Newton iteration to improve the convergence order, as shown in Algorithm 2. The outer iteration is the Newton method, which is used to solve the nonlinear problem, and each outer step solves a linear equation in order to generate the sequence {x_k}. A significant advantage of such inner-outer iterations is that one can avoid storing and computing the inverse of the Jacobian matrix at each step, so as to improve operational efficiency. The PGSS iteration is employed to solve the Newton equation (4) with a real nonsymmetric saddle-point Jacobian matrix. In order to increase the efficiency of the algorithm, we optimize the outer iteration and propose the modified Newton-PGSS method to solve saddle-point problems.
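The inner-outer structure described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the outer loop is an inexact Newton iteration, and `richardson_splitting_solve` is a hypothetical stand-in for the PGSS inner solver (a simple damped stationary splitting iteration), since the PGSS splitting itself is not reproduced here.

```python
import numpy as np

def inexact_newton(F, J, x0, inner_solve, tol=1e-8, max_outer=50):
    """Inexact Newton outer iteration: at each step, approximately solve
    J(x_k) s = -F(x_k) with an inner iterative solver, then set
    x_{k+1} = x_k + s."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        # The inner iteration replaces a direct solve with J^{-1},
        # so the Jacobian inverse is never formed or stored.
        s = inner_solve(J(x), -r)
        x = x + s
    return x

def richardson_splitting_solve(A, b, n_inner=200, omega=0.5):
    """Hypothetical inner solver: a damped Richardson (stationary
    splitting) iteration, used here purely in place of PGSS."""
    s = np.zeros_like(b)
    for _ in range(n_inner):
        s = s + omega * (b - A @ s)
    return s
```

For a well-conditioned test problem such as F(x) = x + 0.1 sin(x) - c, the Richardson inner iteration converges and the outer Newton iteration reaches the residual tolerance in a handful of steps; for an actual saddle-point Jacobian, the PGSS splitting would take the role of the inner solver.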

Preliminaries
The Newton-PGSS Method
Semilocal Convergence of the Newton-PGSS Method
The Modified Newton-PGSS Method and Its Local Convergence
Method