Abstract

This article deals with the Grassmann manifold as a submanifold of the matrix Euclidean space, that is, as the set of all orthogonal projection matrices of constant rank, and sets up several optimization algorithms in terms of such matrices. Interest centers on the steepest descent and Newton's methods, together with applications to matrix eigenvalue problems. It is shown that Newton's equation in the proposed Newton's method, applied to the Rayleigh quotient minimization problem, takes the form of a Lyapunov equation, for which an existing efficient algorithm can be applied; the proposed Newton's method is therefore efficient. It is also shown that, in the case of degenerate eigenvalues, the optimal solutions form a submanifold diffeomorphic to a Grassmann manifold of lower dimension. Furthermore, to generate globally convergent sequences, this article provides a hybrid method composed of the steepest descent and Newton's methods on the Grassmann manifold, together with a convergence analysis.
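To illustrate the kind of equation the abstract refers to, the following is a minimal sketch (not the article's algorithm) of solving a Lyapunov equation A X + X Aᵀ = Q by the dense Kronecker-product approach; the matrices A and Q here are arbitrary test data, and practical implementations would use a specialized solver such as the Bartels–Stewart algorithm instead.

```python
import numpy as np

# Hypothetical test data: A and Q stand in for the coefficient and
# right-hand-side matrices of a Lyapunov equation A X + X A^T = Q.
n = 4
rng = np.random.default_rng(0)
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
Q = rng.standard_normal((n, n))
Q = Q + Q.T  # symmetric right-hand side

# Vectorize with column-major (Fortran) ordering:
#   vec(A X + X A^T) = (I (x) A + A (x) I) vec(X)
K = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
x = np.linalg.solve(K, Q.flatten(order="F"))
X = x.reshape((n, n), order="F")

# The residual should be at machine-precision level.
residual = np.linalg.norm(A @ X + X @ A.T - Q)
print(residual)
```

This O(n⁶) dense solve is only for illustration; the efficiency claim in the abstract rests on structure-exploiting Lyapunov solvers that scale far better.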
