Abstract

Our investigations into solving systems of linear equations have thus far centered on the theoretical aspects of variants of the successive overrelaxation iterative method and of the alternating-direction implicit methods. In all cases, the theoretical selection of the associated optimum acceleration parameters was based on knowledge of certain key eigenvalues. Specifically, only the spectral radius ρ(B) of the block Jacobi matrix B is needed (Theorem 4.7) to determine the optimum relaxation factor

$$ \omega_b = \frac{2}{1 + \sqrt{1 - \rho^2(B)}} \qquad (9.1) $$

when the successive overrelaxation iterative method is applied in cases where B is a consistently ordered weakly cyclic matrix (of index 2) with real eigenvalues. Similarly, when the block Jacobi matrix has the special form

$$ B = \left[ \begin{array}{c|c} O & F \\ \hline F^{*} & O \end{array} \right], \qquad (9.2) $$

the cyclic Chebyshev semi-iterative method of Sect. 5.3 makes use of acceleration parameters \( \omega_m \) defined in (5.24) by

$$ \omega_m = \frac{2\,C_{m-1}\!\left(1/\rho(B)\right)}{\rho(B)\,C_m\!\left(1/\rho(B)\right)}, \quad m > 1, \qquad \omega_1 = 1, \qquad (9.3) $$

where \( C_m(x) \) is the Chebyshev polynomial of degree m. Again, the determination of optimum parameters depends only on knowledge of ρ(B).

The situation regarding variants of the alternating-direction implicit methods is not essentially different. In the commutative case \( H_1 V_1 = V_1 H_1 \) of Sect. 7.2, only knowledge of the bounds α and β in (7.37),

$$ 0 < \alpha \le \sigma_j,\ \tau_j \le \beta, \qquad 1 \le j \le n, $$

on the eigenvalues \( \sigma_j \) and \( \tau_j \) of \( H_1 \) and \( V_1 \) was essential to the theoretical problem of selecting the optimum acceleration parameters \( \tilde r_j \). However, it would be overly optimistic to dismiss the practical problem of estimating these key parameters merely because actual computations require only one or two eigenvalue estimates. Those who have had even limited experience in solving such matrix problems by iterative methods have learned (often painfully) that small changes in the estimates of these key eigenvalues can sometimes drastically affect rates of convergence.¹ In this section, we shall show that further use can be made of the Perron-Frobenius theory of nonnegative matrices in estimating these key eigenvalues.
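The following is a minimal sketch, not part of the original abstract, of how the two families of parameters in (9.1) and (9.3) depend on a single estimate of ρ(B); the function names, the sample value rho = 0.9, and the use of the cosh form of the Chebyshev polynomial are illustrative assumptions, not part of the source.

```python
import math

def omega_b(rho):
    """Optimum SOR relaxation factor (9.1) for a consistently ordered
    weakly cyclic (index 2) block Jacobi matrix with real eigenvalues,
    given an estimate of its spectral radius rho = rho(B) < 1."""
    return 2.0 / (1.0 + math.sqrt(1.0 - rho ** 2))

def chebyshev(m, x):
    """Chebyshev polynomial C_m(x); the cosh form applies here since
    the argument x = 1/rho(B) exceeds 1."""
    return math.cosh(m * math.acosh(x)) if x >= 1.0 else math.cos(m * math.acos(x))

def omega_m(m, rho):
    """Acceleration parameters (9.3) of the cyclic Chebyshev
    semi-iterative method; omega_1 = 1 by definition."""
    if m == 1:
        return 1.0
    x = 1.0 / rho
    return 2.0 * chebyshev(m - 1, x) / (rho * chebyshev(m, x))

if __name__ == "__main__":
    rho = 0.9  # hypothetical estimate of rho(B)
    print("omega_b =", omega_b(rho))
    print("omega_m =", [round(omega_m(m, rho), 4) for m in range(1, 7)])
```

Running such a sketch also illustrates the known relation between the two formulas: for m > 1 the parameters \( \omega_m \) of (9.3) converge to the optimum SOR factor \( \omega_b \) of (9.1) as m grows, so any error in the single estimate of ρ(B) is felt by every parameter, which is precisely the sensitivity the section goes on to address.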
