Abstract

To solve optimization problems on general matrix manifolds with differentiable objective functions, we propose an accelerated hybrid Riemannian conjugate gradient method. Specifically, the acceleration scheme of the proposed method uses a modified stepsize obtained by multiplicatively scaling the stepsize produced by the Wolfe line search. The search direction is determined by a hybrid conjugate parameter that is computationally inexpensive to evaluate. We show that the proposed method converges globally to a stationary point. Numerical experiments on problems such as the orthogonal Procrustes problem and minimization of the Brockett cost function illustrate that our approach outperforms state-of-the-art Riemannian conjugate gradient algorithms.
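To make the setting concrete, the following is a minimal sketch of a hybrid Riemannian conjugate gradient iteration on the unit sphere, minimizing the Rayleigh quotient f(x) = xᵀAx. It is not the paper's algorithm: the retraction (normalization), vector transport (tangent-space projection), hybrid parameter (a clipped Polak-Ribière/Fletcher-Reeves combination), and the backtracking Armijo search standing in for the Wolfe line search are all illustrative assumptions.

```python
import numpy as np

def proj(x, v):
    # Project v onto the tangent space of the unit sphere at x
    return v - np.dot(x, v) * x

def retract(x, v):
    # Retraction: map a tangent step back onto the sphere by normalization
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_cg(A, x0, max_iter=500, tol=1e-6):
    """Hybrid Riemannian CG for f(x) = x^T A x on the unit sphere (illustrative sketch)."""
    x = x0 / np.linalg.norm(x0)
    g = proj(x, 2 * A @ x)            # Riemannian gradient of f
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking Armijo line search (stand-in for the Wolfe conditions)
        t, f0, slope = 1.0, x @ (A @ x), g @ d
        for _ in range(50):
            x_try = retract(x, t * d)
            if x_try @ (A @ x_try) <= f0 + 1e-4 * t * slope:
                break
            t *= 0.5
        x_new = retract(x, t * d)
        g_new = proj(x_new, 2 * A @ x_new)
        g_t = proj(x_new, g)          # vector transport of old gradient by projection
        d_t = proj(x_new, d)          # vector transport of old direction
        # Hybrid conjugate parameter: clipped PR/FR combination (assumed, for illustration)
        beta_fr = (g_new @ g_new) / (g @ g)
        beta_pr = (g_new @ (g_new - g_t)) / (g @ g)
        beta = max(0.0, min(beta_pr, beta_fr))
        d = -g_new + beta * d_t
        x, g = x_new, g_new
    return x
```

On a diagonal A, the iteration converges to the eigenvector of the smallest eigenvalue, since that is the minimizer of the Rayleigh quotient on the sphere.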
