In inverse eigenvalue problems one tries to reconstruct a matrix, satisfying some constraints, given some spectral information. Here, two inverse eigenvalue problems are solved.

First, given the eigenvalues and the first components of the associated eigenvectors (called the weight vector), an extended Hessenberg matrix with prescribed poles is computed that possesses these eigenvalues and satisfies the eigenvector constraints. The extended Hessenberg matrix is retrieved by executing particularly designed unitary similarity transformations on the diagonal matrix containing the eigenvalues. This inverse problem is closely linked to orthogonal rational functions: the extended Hessenberg matrix contains their recurrence coefficients, where the nodes (eigenvalues), the poles (poles of the extended Hessenberg matrix), and the weight vector (first eigenvector components) determine the discrete inner product. Moreover, the problem is, in a sense, the inverse of the (rational) Arnoldi algorithm: instead of using the (rational) Arnoldi method to compute a Krylov basis to approximate the spectrum, we reconstruct the orthogonal Krylov basis given the spectral information.

In the second inverse eigenvalue problem we do the same, but refrain from imposing unitarity. As a result, we execute possibly non-unitary similarity transformations on the diagonal matrix of eigenvalues to retrieve a (non)symmetric extended tridiagonal matrix. The algorithm is less stable, but it is faster, as the extended tridiagonal matrix admits a low-cost O(n) factorization (n being the number of eigenvalues), whereas the extended Hessenberg matrix does not. Again there is a close link with orthogonal function theory: the extended tridiagonal matrix captures the recurrence coefficients of biorthogonal rational functions. Moreover, this problem is, in a sense, the inverse of the nonsymmetric Lanczos algorithm: given spectral properties, we reconstruct the two Krylov basis matrices linked to the nonsymmetric Lanczos algorithm.
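The pole-free special case of the first problem can be illustrated with a minimal sketch (an assumed illustration, not the paper's algorithm): running the standard Arnoldi process on the diagonal matrix of eigenvalues, with the normalized weight vector as starting vector, yields an ordinary Hessenberg matrix that has the prescribed eigenvalues and whose eigenvectors have first components proportional to the weights. The paper's method generalizes this construction to prescribed poles via rational Krylov.

```python
# Minimal sketch (assumed illustration): pole-free inverse eigenvalue problem
# solved by the standard Arnoldi process on diag(eigenvalues) started at the
# normalized weight vector.
import numpy as np

def hessenberg_from_spectrum(eigenvalues, weights):
    """Return H (upper Hessenberg) and Q (unitary) with H = Q^H diag(eigenvalues) Q.

    H has the prescribed eigenvalues, and the first components of its
    eigenvectors are proportional to the weights (up to a unimodular factor
    per eigenvector)."""
    A = np.diag(eigenvalues).astype(complex)
    n = len(eigenvalues)
    Q = np.zeros((n, n), dtype=complex)
    H = np.zeros((n, n), dtype=complex)
    Q[:, 0] = np.asarray(weights, dtype=complex) / np.linalg.norm(weights)
    for k in range(n):
        v = A @ Q[:, k]
        # Orthogonalize against the previous basis vectors (modified Gram-Schmidt).
        for j in range(k + 1):
            H[j, k] = np.vdot(Q[:, j], v)
            v -= H[j, k] * Q[:, j]
        if k + 1 < n:
            H[k + 1, k] = np.linalg.norm(v)
            Q[:, k + 1] = v / H[k + 1, k]
    return H, Q

# Usage: check that the recovered Hessenberg matrix has the prescribed spectrum.
rng = np.random.default_rng(0)
lam = np.sort(rng.uniform(-1.0, 1.0, 8))     # distinct eigenvalues (nodes)
w = rng.uniform(0.1, 1.0, 8)                 # nonzero weights
H, Q = hessenberg_from_spectrum(lam, w)
print(np.allclose(np.sort(np.linalg.eigvals(H).real), lam))
```

The second, non-unitary problem would replace this orthogonalization by a two-sided (nonsymmetric Lanczos-like) process, trading stability for an extended tridiagonal structure that can be factored in O(n) operations.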