Abstract

We extend the regularized Hermitian and skew-Hermitian splitting (RHSS) iteration methods for standard saddle-point problems to stabilized saddle-point problems and establish the corresponding unconditional convergence theory for the resulting methods. Besides serving as stationary iterative solvers, these RHSS methods can also be used as preconditioners for Krylov subspace methods. It is shown that the eigenvalues of the corresponding preconditioned matrix are clustered at a small number of points in the interval $(0, \, 2)$ when the iteration parameter is close to $0$; furthermore, they can be clustered near $0$ and $2$ when the regularization matrix is appropriately chosen. Numerical results on stabilized saddle-point problems arising from finite element discretizations of an optimal boundary control problem and of a Cahn–Hilliard image inpainting problem, as well as from the Gauss–Newton linearization of a nonlinear image restoration problem, show that the RHSS iteration method significantly outperforms the Hermitian and skew-Hermitian splitting iteration method in iteration counts and computing times, whether they are used as linear iterative solvers or as matrix splitting preconditioners for Krylov subspace methods. Moreover, optimal convergence behavior can be achieved when inexact variants of the proposed RHSS preconditioners are used.
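For orientation, the following LaTeX sketch recalls the generic block form of a stabilized saddle-point system and the classical Hermitian and skew-Hermitian splitting (HSS) iteration that the RHSS methods regularize. The block names $A$, $B$, $C$ and the shift parameter $\alpha$ are illustrative placeholders rather than the paper's notation, and the precise RHSS splitting for stabilized problems (including the choice of regularization matrix) is developed in the paper itself.

% A stabilized saddle-point system in its nonsymmetric positive semidefinite form
% (block names A, B, C are assumed placeholders; C is the stabilization block):
\mathcal{A}\, z \equiv
\begin{pmatrix} A & B^{*} \\ -B & C \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} f \\ -g \end{pmatrix}
\equiv b .

% Classical HSS iteration (Bai, Golub, Ng): split \mathcal{A} = \mathcal{H} + \mathcal{S} with
%   \mathcal{H} = (\mathcal{A} + \mathcal{A}^{*})/2   (Hermitian part),
%   \mathcal{S} = (\mathcal{A} - \mathcal{A}^{*})/2   (skew-Hermitian part),
% and alternate, for an iteration parameter \alpha > 0:
\begin{aligned}
(\alpha I + \mathcal{H})\, z^{(k+1/2)} &= (\alpha I - \mathcal{S})\, z^{(k)} + b, \\
(\alpha I + \mathcal{S})\, z^{(k+1)}   &= (\alpha I - \mathcal{H})\, z^{(k+1/2)} + b .
\end{aligned}
% The RHSS methods modify this splitting by means of a regularization matrix
% (the "regularization matrix" referred to in the abstract), which governs the
% eigenvalue clustering of the preconditioned matrix described above.

The matrix induced by this two-half-step splitting is also what is used as the (R)HSS preconditioner for Krylov subspace methods, which is the setting in which the eigenvalue clustering in $(0, \, 2)$ is stated.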
