We propose a quadratically convergent algorithm for inverse symmetric eigenvalue problems based on matrix equations. The basic idea stems from a recent study by Ogita and Aishima, in which they derived an efficient iterative refinement algorithm for symmetric eigenvalue problems using special matrix equations. In other words, this study can be interpreted as a unified view of quadratically convergent algorithms, based on matrix equations, for eigenvalue problems and inverse eigenvalue problems. To the best of our knowledge, such a unified development of algorithms is provided here for the first time. Since the proposed algorithm for inverse eigenvalue problems can be regarded as Newton's method applied to the matrix equations, quadratic convergence follows naturally. Our algorithm can also be interpreted as an improved version of the Cayley transform method for inverse eigenvalue problems. Although the Cayley transform method is an effective iterative method, the Cayley transform requires O(n^3) arithmetic operations in each iteration to produce an orthogonal matrix from a skew-symmetric matrix. Our algorithm refines orthogonality without the Cayley transform, which reduces the cost of each iteration. It is worth noting that our approach also overcomes the restriction of the Cayley transform method to inverse standard eigenvalue problems, yielding an extension to inverse generalized eigenvalue problems.
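As a concrete illustration of the O(n^3) step mentioned above (this sketch is not from the paper itself), the Cayley transform maps a skew-symmetric matrix S to the orthogonal matrix Q = (I - S)(I + S)^{-1}; the linear solve involved is what costs O(n^3) per iteration:

```python
import numpy as np

def cayley(S):
    """Cayley transform: map a skew-symmetric matrix S to the
    orthogonal matrix Q = (I - S)(I + S)^{-1}.

    The dense linear solve costs O(n^3) arithmetic operations,
    which is the per-iteration cost noted in the abstract.
    """
    n = S.shape[0]
    I = np.eye(n)
    # Solve (I + S)^T X = (I - S)^T, so that X^T = (I - S)(I + S)^{-1}.
    return np.linalg.solve((I + S).T, (I - S).T).T

# Build a random skew-symmetric matrix and check orthogonality of Q.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
S = (A - A.T) / 2                      # skew-symmetric part of A
Q = cayley(S)
orth_err = np.linalg.norm(Q.T @ Q - np.eye(5))
```

For skew-symmetric S, (I + S) is always invertible and Q is exactly orthogonal in exact arithmetic; in floating point, `orth_err` is on the order of unit roundoff.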