Abstract

This paper proposes a novel general framework for Riemannian conjugate gradient methods, that is, conjugate gradient methods on Riemannian manifolds. Conjugate gradient methods are important first-order optimization algorithms both in Euclidean spaces and on Riemannian manifolds; however, while many variants have been studied in the Euclidean setting, far fewer have been studied in the Riemannian one. In each iteration of a Riemannian conjugate gradient method, the previous search direction must be transported to the current tangent space so that it can be combined with the negative gradient of the objective function at the current point. Because there are several ways to transport a tangent vector to another tangent space, the Riemannian setting admits more variants than the Euclidean case. To investigate these variants systematically, the proposed framework unifies existing Riemannian conjugate gradient methods, such as those using a vector transport or an inverse retraction, and also yields methods not covered in previous studies. Furthermore, sufficient conditions for the convergence of a class of algorithms in the proposed framework are clarified, and the global convergence properties of several specific types of algorithms are analyzed in detail. These analyses extend existing theoretical results to a more general setting for some algorithms and provide entirely new results for others. Numerical experiments confirm the validity of the theoretical results and compare the performance of several specific algorithms within the proposed framework.
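To make the iteration structure described above concrete, the following is a minimal sketch of one member of this family of methods, not the paper's own algorithm. It assumes a specific set of choices for illustration: the unit sphere as the manifold, the normalization map as the retraction, orthogonal projection onto the new tangent space as the vector transport, the Fletcher-Reeves coefficient for beta, and the Rayleigh quotient f(x) = x^T A x as the objective. The function name riemannian_cg_sphere and all tolerances are hypothetical.

import numpy as np

def riemannian_cg_sphere(A, x0, max_iter=200, tol=1e-8):
    x = x0 / np.linalg.norm(x0)
    # Riemannian gradient: Euclidean gradient 2Ax projected onto the
    # tangent space T_x = {v : x^T v = 0}.
    grad = lambda x: 2 * A @ x - 2 * (x @ A @ x) * x
    g = grad(x)
    d = -g  # initial search direction is the negative gradient
    for _ in range(max_iter):
        # Backtracking Armijo line search along the retracted curve.
        alpha, f_x = 1.0, x @ A @ x
        while True:
            y = x + alpha * d
            y /= np.linalg.norm(y)  # retraction R_x(alpha * d)
            if y @ A @ y <= f_x + 1e-4 * alpha * (g @ d) or alpha < 1e-12:
                break
            alpha *= 0.5
        g_new = grad(y)
        if np.linalg.norm(g_new) < tol:
            return y
        # Fletcher-Reeves coefficient.
        beta = (g_new @ g_new) / (g @ g)
        # Vector transport: project the old direction onto T_y.
        d_transported = d - (y @ d) * y
        d = -g_new + beta * d_transported
        x, g = y, g_new
    return x

On this illustrative problem the iterates approach an eigenvector of A associated with its smallest eigenvalue. Swapping the projection transport for, say, an inverse-retraction-based rule, or changing the beta formula, produces a different member of the family; it is exactly this axis of variation that the proposed framework is intended to organize.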
