Abstract

Imagine that one has computed the real n-vectors b, Ab, A^2 b, ..., A^(m-1) b, where A is a real symmetric n x n matrix. Lanczos showed us in 1950 how to construct a much better basis for the (Krylov) space spanned by these power vectors, and at little extra cost. The new basis (q1, q2, ..., qm), now called the Lanczos basis, has two nice properties: (1) it is orthonormal; (2) the representation of A's projection onto the Krylov space is a symmetric tridiagonal matrix Tm. Property (2) is synonymous with the three-term recurrence governing the Lanczos vectors. Moreover, some of Tm's eigenvalues, called Ritz values hereafter, are excellent approximations to some of A's eigenvalues even when m << n. In addition, we can tell, at little expense, which Ritz values are also eigenvalues. One surprising implication of these properties is that it is easier to find the largest few eigenvalues of A than to solve Ax = b! When the Lanczos algorithm is implemented on a computer, the user discovers an unpleasant fact: property (1) fails completely for m as small as 20 or 30, and consequently the computed Tm's relation to A is unclear. Lanczos was aware of this blemish and proposed the obvious remedy: keep applying the Gram-Schmidt process to each new Lanczos vector as it is computed. The catch here is that all the qi must be kept handy, whereas in exact arithmetic only the three latest Lanczos vectors are needed and earlier q's may be discarded. The arithmetic cost of this full reorthogonalization grows quadratically with m. In exact arithmetic Tn is similar to A and the algorithm stops. So the hope of computing Tn efficiently and accurately by the Lanczos algorithm was dashed, and other methods prevailed.
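
To make the three-term recurrence, the exact-arithmetic stopping rule, and Lanczos' reorthogonalization remedy concrete, here is a minimal NumPy sketch. It is not taken from the paper; the function name lanczos, its signature, and the full_reorth flag are illustrative assumptions. It builds the basis Q = (q1, ..., qm) and the tridiagonal projection Tm, with an optional flag that applies full reorthogonalization against all earlier Lanczos vectors.

    import numpy as np

    def lanczos(A, b, m, full_reorth=False):
        """Lanczos iteration for a real symmetric matrix A (a sketch, not the paper's code).

        Builds an orthonormal basis q_1, ..., q_m of the Krylov space
        span{b, Ab, ..., A^(m-1) b} and the symmetric tridiagonal projection
        T_m (diagonal alpha, off-diagonals beta).
        """
        n = b.shape[0]
        Q = np.zeros((n, m))
        alpha = np.zeros(m)
        beta = np.zeros(max(m - 1, 0))

        Q[:, 0] = b / np.linalg.norm(b)
        for j in range(m):
            w = A @ Q[:, j]
            alpha[j] = Q[:, j] @ w
            # Three-term recurrence: only the two latest Lanczos vectors are needed.
            w -= alpha[j] * Q[:, j]
            if j > 0:
                w -= beta[j - 1] * Q[:, j - 1]
            if full_reorth:
                # Lanczos' remedy: Gram-Schmidt against *all* earlier q's.
                # This preserves orthonormality in floating point, but the cost grows with j.
                w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
            if j < m - 1:
                beta[j] = np.linalg.norm(w)
                if beta[j] == 0.0:
                    # Invariant subspace found: the exact-arithmetic stopping condition.
                    Q, alpha, beta = Q[:, : j + 1], alpha[: j + 1], beta[:j]
                    break
                Q[:, j + 1] = w / beta[j]

        T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
        return Q, T

The Ritz values are the eigenvalues of T, computable with np.linalg.eigvalsh(T), and the largest of them can be compared with the largest eigenvalues of A. The loss of orthogonality the abstract describes for m around 20 or 30 can be observed directly, when full_reorth is False, by checking np.linalg.norm(Q.T @ Q - np.eye(Q.shape[1])).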
