Abstract

Self-concordant functions are a special class of convex functions in Euclidean space introduced by Nesterov. They play an important role in interior point methods, which are based on Newton iterations and solve certain constrained optimization problems efficiently. Jiang et al. extended the concept of self-concordant functions to Riemannian manifolds and developed a damped Newton method for that setting. As a further development, this paper proposes a damped conjugate gradient method: an ordinary conjugate gradient method equipped with a novel step-size selection rule that provably ensures convergence to the global minimum. The advantage of the damped conjugate gradient algorithm over the damped Newton method is its lower computational complexity. To illustrate this advantage, the algorithm is applied to computing the center of mass, known as the Karcher mean, of given points on the hyperboloid model.
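The abstract does not reproduce the paper's step-size rule, but the optimization problem it targets can be sketched. The Karcher mean of points p_1, ..., p_N on a manifold is the minimizer of f(x) = (1/2) * sum_i d(x, p_i)^2, where d is the geodesic distance. The Python sketch below runs a Riemannian conjugate gradient iteration for this problem on the hyperboloid model; the Fletcher-Reeves coefficient, the projection-based vector transport, and in particular the damping rule t = 1/(1 + ||d_k||) are illustrative assumptions, not the step-size selection rule proved convergent in the paper.

```python
import numpy as np

def minkowski(u, v):
    # Lorentz inner product <u, v>_L = -u0*v0 + u1*v1 + ... + un*vn
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def project(x, v):
    # Project v onto the tangent space at x (tangent vectors satisfy <x, v>_L = 0)
    return v + minkowski(x, v) * x

def exp_map(x, v):
    # Exponential map: follow the geodesic from x in tangent direction v
    n = np.sqrt(max(minkowski(v, v), 0.0))
    if n < 1e-12:
        return x
    return np.cosh(n) * x + np.sinh(n) * (v / n)

def log_map(x, y):
    # Logarithm map: tangent vector at x whose geodesic reaches y
    c = max(-minkowski(x, y), 1.0)   # cosh(d(x, y)), clipped against round-off
    d = np.arccosh(c)
    u = y - c * x                    # Lorentz-orthogonal component; its norm is sinh(d)
    s = np.sinh(d)
    return (d / s) * u if s > 1e-12 else u

def karcher_mean_cg(points, x0, iters=200, tol=1e-12):
    # Gradient of f(x) = (1/2) * sum_i d(x, p_i)^2 is -sum_i log_x(p_i)
    x = x0
    g = -sum(log_map(x, p) for p in points)
    d = -g
    for _ in range(iters):
        g_norm2 = minkowski(g, g)
        if g_norm2 < tol:
            break
        # ASSUMED damping rule, standing in for the paper's step-size selection:
        # shrink the step as the search direction grows, echoing the damped
        # Newton step t = 1/(1 + lambda) used with self-concordant functions.
        t = 1.0 / (1.0 + np.sqrt(max(minkowski(d, d), 0.0)))
        x = exp_map(x, t * d)
        g_new = -sum(log_map(x, p) for p in points)
        beta = minkowski(g_new, g_new) / g_norm2   # Fletcher-Reeves coefficient
        d = -g_new + beta * project(x, d)          # transport old direction by projection
        g = g_new
    return x

# Example: average three points near e0 = (1, 0, 0) on the 2D hyperboloid
e0 = np.array([1.0, 0.0, 0.0])
pts = [exp_map(e0, np.array([0.0, a, b])) for a, b in [(0.3, 0.0), (0.0, 0.2), (-0.1, 0.1)]]
print(karcher_mean_cg(pts, e0))
```

The example at the bottom averages three nearby points on the two-dimensional hyperboloid; an implementation of the paper's method would replace the assumed damping line with its actual step-size selection rule.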
