Abstract

The Lanczos method is one of the standard approaches for computing a few eigenpairs of a large, sparse, symmetric matrix. It is typically used with restarting to avoid unbounded growth of memory and computational requirements. Thick-restart Lanczos is a popular restarted variant because of its simplicity and numerical robustness. However, convergence can be slow for highly clustered eigenvalues, so more effective restarting techniques and the use of preconditioning are needed. In this paper, we present a thick-restart preconditioned Lanczos method, TRPL+K, that combines the power of locally optimal restarting (+K) and preconditioning techniques with the efficiency of the thick-restart Lanczos method. TRPL+K employs an inner-outer scheme where the inner loop applies Lanczos on a preconditioned operator, while the outer loop augments the resulting Lanczos subspace with certain vectors from the previous restart cycle to obtain eigenvector approximations with which it thick restarts the outer subspace. We first identify the differences between TRPL+K and various relevant methods in the literature. Then, based on an optimization perspective, we show an asymptotic global quasi-optimality of a simplified TRPL+K method compared to an unrestarted globally optimal method. Finally, we present extensive experiments showing that TRPL+K either outperforms or matches other state-of-the-art eigenmethods in both matrix-vector multiplications and computational time.
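For readers unfamiliar with the building block the abstract refers to, the following is a minimal sketch of the classical symmetric Lanczos three-term recurrence on which thick-restart variants such as TRPL+K are built. It is not the authors' implementation; the function name `lanczos`, the operator interface `A_mul`, and the use of full reorthogonalization are illustrative choices of ours.

```python
import numpy as np

def lanczos(A_mul, v0, m):
    """Build an m-step Lanczos factorization A V_m = V_m T_m + beta_m v_{m+1} e_m^T
    for a symmetric operator supplied as a matrix-vector product A_mul(x)."""
    n = v0.shape[0]
    V = np.zeros((n, m + 1))          # Lanczos basis vectors (columns)
    alphas = np.zeros(m)              # diagonal of the tridiagonal T_m
    betas = np.zeros(m)               # off-diagonal of T_m (last entry is the residual norm)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A_mul(V[:, j])
        if j > 0:
            w -= betas[j - 1] * V[:, j - 1]
        alphas[j] = V[:, j] @ w
        w -= alphas[j] * V[:, j]
        # Full reorthogonalization against all previous basis vectors
        # (a simple, robust choice; production codes use selective schemes).
        w -= V[:, : j + 1] @ (V[:, : j + 1].T @ w)
        betas[j] = np.linalg.norm(w)
        if betas[j] == 0:             # invariant subspace found; stop early
            return V[:, : j + 1], alphas[: j + 1], betas[:j]
        V[:, j + 1] = w / betas[j]
    return V, alphas, betas

# Usage: extreme Ritz values of the tridiagonal T_m approximate extreme eigenvalues of A.
rng = np.random.default_rng(0)
B = rng.standard_normal((200, 200))
A = (B + B.T) / 2                     # small symmetric test matrix
V, a, b = lanczos(lambda x: A @ x, rng.standard_normal(200), 40)
T = np.diag(a) + np.diag(b[:-1], 1) + np.diag(b[:-1], -1)
print(np.linalg.eigvalsh(T)[-3:])     # Ritz approximations
print(np.linalg.eigvalsh(A)[-3:])     # exact largest eigenvalues
```

In a restarted method, a cycle like the one above is truncated at a modest m, the best Ritz vectors (and, in TRPL+K, selected vectors from the previous cycle) are kept, and the iteration is restarted from that compressed subspace; preconditioning replaces `A_mul` with a preconditioned operator in the inner loop.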
