Abstract

This chapter discusses Chebyshev acceleration and its application to basic iterative methods. Chebyshev acceleration can significantly improve the convergence rate when optimum iteration parameters are used. The chapter presents computational algorithms that generate the necessary Chebyshev iteration parameters adaptively during the iteration process. Many iterations are often required before the asymptotic rate of convergence is achieved. The estimate M_E of the largest eigenvalue M(G) of the iteration matrix G is the critical parameter: it is very difficult to improve an overestimate M_E > M(G), but if M_E is an underestimate of M(G), improved estimates for M(G) can be obtained by using the adaptive procedures. Chebyshev acceleration is relatively insensitive to the estimate m_E of the smallest eigenvalue m(G) as long as m_E ≤ m(G).
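As a rough illustration of the technique the abstract describes, the following is a minimal sketch of Chebyshev acceleration applied to a basic iterative method, using the standard three-term recurrence with fixed eigenvalue estimates m_E and M_E. It is not the chapter's adaptive algorithm (which would update M_E during the iteration); the function name, the choice of Jacobi as the basic method, and the parameter names are assumptions for the example.

```python
import numpy as np

def chebyshev_jacobi(A, b, m_E, M_E, iters=80):
    """Chebyshev acceleration of the Jacobi iteration u <- G u + k,
    where G = I - D^{-1} A.  m_E and M_E are estimates of the smallest
    and largest eigenvalues m(G) <= M(G) < 1 of the iteration matrix G.
    (Sketch with fixed estimates; an adaptive procedure would refine
    M_E as the iteration proceeds.)"""
    d_inv = 1.0 / np.diag(A)
    step = lambda u: u + d_inv * (b - A @ u)     # one basic (Jacobi) sweep
    gamma = 2.0 / (2.0 - M_E - m_E)              # extrapolation factor
    sigma = (M_E - m_E) / (2.0 - M_E - m_E)
    u_old = np.zeros_like(b, dtype=float)
    # First iterate uses rho_1 = 1.
    u = gamma * step(u_old) + (1.0 - gamma) * u_old
    rho = 1.0 / (1.0 - 0.5 * sigma * sigma)      # rho_2
    for _ in range(2, iters + 1):
        # Three-term Chebyshev recurrence combining the new basic
        # iterate with the two previous accelerated iterates.
        u, u_old = rho * (gamma * step(u) + (1.0 - gamma) * u) \
                   + (1.0 - rho) * u_old, u
        rho = 1.0 / (1.0 - 0.25 * sigma * sigma * rho)
    return u
```

For a symmetric spectrum (m_E = -M_E, as for Jacobi on the 1-D model problem) gamma reduces to 1 and sigma to M_E, and the accelerated method converges far faster than the basic iteration alone; accuracy of M_E is what matters, in line with the abstract's remarks.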
