Abstract

The Jacobi, Gauss-Seidel and SOR methods belong to the class of simple iterative methods for linear systems. Because of the relaxation parameter ω, the SOR method is more effective than the Gauss-Seidel method. Here, a new approach to the simple iterative methods is proposed. A new parameter q can be introduced into every simple iterative method. Then, if the matrix of the system is positive definite and the parameter q is sufficiently large, the method is convergent. The original Jacobi method is convergent only if the matrix is diagonally dominant, while the Jacobi method with the parameter q is convergent for every positive definite matrix. An optimality criterion for the choice of the parameter q is given, and from it interesting results for the Jacobi, Richardson and Gauss-Seidel methods are obtained. The Gauss-Seidel method with the parameter q is, in a sense, equivalent to the SOR method. From the formula for the optimal value of q follows the formula for the optimal value of ω; up to now, this formula was known only in special cases. A practical approximate formula for the optimal value of ω is also given. The influence of the parameter q on the speed of convergence of the simple iterative methods is shown in a numerical example. Numerical experiments confirm that for very large-scale systems the speed of convergence of the SOR method with the optimal or approximate parameter ω is nearly the same as (and in some cases better than) that of the conjugate gradient method.
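The abstract does not spell out how the parameter q enters the iteration, so the following NumPy sketch rests on one plausible reading, stated here as an assumption rather than the paper's definition: the diagonal D of the Jacobi splitting is replaced by D + qI, which damps the step and, for q large enough, gives convergence for any symmetric positive definite matrix. The function name jacobi_with_q and the choice q = ||A||_2 are illustrative only.

```python
import numpy as np

def jacobi_with_q(A, b, q=0.0, x0=None, tol=1e-10, max_iter=100_000):
    """Jacobi-type iteration with an extra parameter q (illustrative sketch).

    Assumed form of the modification (not taken from the paper itself):
    the diagonal D of A is replaced by D + q*I in the splitting, i.e.
        x_{k+1} = x_k + (D + q*I)^{-1} (b - A x_k).
    q = 0 recovers the classical Jacobi method; a sufficiently large q
    damps the step enough to converge for any symmetric positive definite
    A, at the price of slower convergence when q is taken too large.
    """
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float)
    d_shifted = np.diag(A) + q                     # diagonal of D + q*I
    for k in range(max_iter):
        r = b - A @ x                              # current residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        x = x + r / d_shifted                      # shifted-Jacobi update
    return x, k


# Small illustration: a symmetric positive definite matrix that is not
# diagonally dominant, so the classical Jacobi method (q = 0) may diverge,
# while a large enough q restores convergence.
rng = np.random.default_rng(0)
M = rng.standard_normal((40, 40))
A = M @ M.T / 40 + np.eye(40)
b = rng.standard_normal(40)
x, its = jacobi_with_q(A, b, q=np.linalg.norm(A, 2))   # conservative choice of q
```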

Highlights

  • The solution of linear systems is a fundamental problem in numerical analysis, especially if the system is of very large scale

  • We return to the classical simple iterative methods: the Jacobi method, the Richardson method, the Gauss-Seidel method and the Successive Over-Relaxation (SOR) method

  • There are many results concerning the optimal relaxation parameter ω for the SOR method; a classical special-case formula is recalled below
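One such classical special case (Young's result for consistently ordered matrices, e.g. standard discretizations of the Poisson equation) relates the optimal relaxation parameter to the spectral radius of the Jacobi iteration matrix B_J:

\[
  \omega_{\mathrm{opt}} \;=\; \frac{2}{1 + \sqrt{1 - \rho(B_J)^{2}}},
  \qquad
  \rho\!\left(\mathcal{L}_{\omega_{\mathrm{opt}}}\right) \;=\; \omega_{\mathrm{opt}} - 1,
\]

where \(\mathcal{L}_{\omega}\) denotes the SOR iteration matrix. The paper's contribution, as stated in the abstract, is an analogous formula valid for every symmetric positive definite matrix.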


Summary

Introduction

The solution of linear systems is a fundamental problem in numerical analysis, especially if the system is of very large scale. In the papers [6, 9, 10] one can find an exact formula for the optimal parameter ω for some discretizations of the Poisson equation; here such a formula is given for every symmetric positive definite matrix. The simple iterative methods are defined, and the parameter q is introduced. In this way we obtain the Jacobi method with the parameter q and, for sufficiently large q, the method is convergent not only for diagonally dominant matrices but for every positive definite matrix. The spectral radius of the iteration matrix plays an important role in the analysis of the speed of convergence of the iterative methods. Results concerning the Gauss-Seidel and SOR methods are presented, and formulas for the optimal values of q and ω are given. Finally, a comparison of the speed of convergence for some of the iterative methods is made.
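As a concrete companion to this summary, here is a minimal NumPy sketch of the Gauss-Seidel/SOR iteration (not code from the paper); omega = 1 gives Gauss-Seidel. The example applies the classical consistently-ordered formula for the optimal ω, the special case mentioned in the abstract, to a 1D Poisson matrix; the function name sor and all parameter choices are illustrative.

```python
import numpy as np

def sor(A, b, omega=1.0, x0=None, tol=1e-10, max_iter=50_000):
    """Successive Over-Relaxation sweeps; omega = 1 recovers Gauss-Seidel."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    for k in range(max_iter):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        for i in range(n):
            # sum over already-updated entries (j < i) and old entries (j > i)
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (1.0 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x, k


# Example on the 1D Poisson (tridiagonal) matrix, which is consistently
# ordered, so the classical formula for the optimal omega applies.
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
B_J = np.eye(n) - A / 2.0                        # Jacobi iteration matrix (here D = 2I)
rho_J = np.max(np.abs(np.linalg.eigvals(B_J)))   # its spectral radius
omega_opt = 2.0 / (1.0 + np.sqrt(1.0 - rho_J ** 2))
x_gs, it_gs = sor(A, b, omega=1.0)               # Gauss-Seidel
x_so, it_so = sor(A, b, omega=omega_opt)         # SOR with near-optimal omega: far fewer sweeps
```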

Convergence of the Method
The Richardson Method
The Gauss-Seidel Method with the Parameter q
Numerical Example
Conclusions
