Abstract
A method for solving a partial algebraic eigenvalue problem is constructed. It exploits the tensor structure of eigenvectors in the two-dimensional case. For a symmetric matrix represented in tensor format, the method finds low-rank approximations to the eigenvectors corresponding to the smallest eigenvalues. For sparse matrices, the execution time and memory required by the proposed method are proportional to the square root of the overall number of unknowns, whereas this dependence is usually linear. To maintain the tensor structure of the vectors at each iteration step, low-rank approximations are performed, which introduces errors into the original method. Nevertheless, the new method is proved to converge. Convergence rate estimates are obtained for various tensor modifications of the abstract one-step method. It is shown how the convergence of a multistep method can be derived from the convergence of the corresponding one-step method. Several modifications of the method with low-rank approximation techniques were implemented on the basis of the block conjugate gradient method, and their performance is compared on numerical examples.
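The following is a minimal illustrative sketch of the core idea described above: a vector of length n*n is viewed as an n-by-n matrix and truncated to low rank after every iteration step so that the tensor (low-rank) structure is preserved. The test matrix (a 2-D Laplacian assembled as a Kronecker sum), the rank parameter r, and the use of plain inverse iteration in place of the paper's block conjugate gradient scheme are all assumptions made for the example; this is not the authors' implementation.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def truncate(x, n, r):
    # Rank-r truncation: view the length n*n vector as an n-by-n matrix,
    # keep the r dominant singular triples, and flatten back to a vector.
    U, s, Vt = np.linalg.svd(x.reshape(n, n), full_matrices=False)
    return ((U[:, :r] * s[:r]) @ Vt[:r, :]).ravel()

def smallest_eig_truncated(A, n, r, steps=30):
    # Inverse iteration with low-rank truncation after every step --
    # an illustrative stand-in for the block-CG-based scheme of the paper.
    solve = spla.splu(A.tocsc()).solve
    x = truncate(np.random.default_rng(0).standard_normal(n * n), n, r)
    x /= np.linalg.norm(x)
    for _ in range(steps):
        x = truncate(solve(x), n, r)   # apply A^{-1}, then re-truncate
        x /= np.linalg.norm(x)
    return x @ (A @ x)                 # Rayleigh quotient = eigenvalue estimate

# Hypothetical test problem: 2-D finite-difference Laplacian as a Kronecker sum.
n, r = 64, 3
L = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
I = sp.identity(n)
A = (sp.kron(L, I) + sp.kron(I, L)).tocsr()

print(smallest_eig_truncated(A, n, r))        # truncated iteration
print(spla.eigsh(A, k=1, sigma=0)[0][0])      # reference smallest eigenvalue
```

Because the truncation stores only r pairs of length-n factors per vector, the memory per iterate grows like n (the square root of the n*n unknowns) rather than n*n, which mirrors the complexity claim in the abstract; the truncation error introduced at each step is what the paper's convergence analysis accounts for.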