Abstract

The well-known symmetric rank-one trust-region method, in which the Hessian approximation is generated by the symmetric rank-one update, is generalized to the problem of minimizing a real-valued function over a $$d$$-dimensional Riemannian manifold. The generalization relies on basic differential-geometric concepts, such as tangent spaces, Riemannian metrics, and the Riemannian gradient, as well as on the more recent notions of (first-order) retraction and vector transport. The new method, called RTR-SR1, is shown to converge globally and $$d+1$$-step q-superlinearly to stationary points of the objective function. A limited-memory version, referred to as LRTR-SR1, is also introduced. In this context, novel efficient strategies are presented to construct a vector transport on a submanifold of a Euclidean space. Numerical experiments, namely Rayleigh quotient minimization on the sphere and a joint diagonalization problem on the Stiefel manifold, illustrate the value of the new methods.
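
To give a rough feel for the ingredients named above, the sketch below assumes the manifold is the unit sphere embedded in R^n and runs a heavily simplified Riemannian trust-region loop for Rayleigh quotient minimization: the retraction is normalization, the vector transport is projection onto the new tangent space, the quadratic model uses an SR1 Hessian approximation (kept here as an ambient n-by-n matrix for simplicity), and the subproblem is solved only at the Cauchy point. This is not the paper's RTR-SR1 or LRTR-SR1 algorithm; all function names and parameter values are illustrative.

```python
# Minimal sketch (not the paper's RTR-SR1): Cauchy-point trust-region loop on the
# unit sphere with an SR1 Hessian approximation, retraction, and vector transport.
import numpy as np

def proj(x, v):
    """Orthogonal projection of an ambient vector v onto the tangent space at x."""
    return v - np.dot(x, v) * x

def retract(x, eta):
    """Retraction on the sphere: step in the ambient space, then renormalize."""
    y = x + eta
    return y / np.linalg.norm(y)

def transport(x_new, xi):
    """Vector transport by projection onto the tangent space at x_new."""
    return proj(x_new, xi)

def f(A, x):
    """Rayleigh quotient on the unit sphere (||x|| = 1)."""
    return x @ A @ x

def grad(A, x):
    """Riemannian gradient of the Rayleigh quotient."""
    return proj(x, 2.0 * A @ x)

def sr1(B, s, y, eps=1e-8):
    """Symmetric rank-one update of the Hessian approximation B (may be indefinite)."""
    r = y - B @ s
    denom = r @ s
    if abs(denom) <= eps * np.linalg.norm(r) * np.linalg.norm(s):
        return B  # standard SR1 safeguard: skip the update when the denominator is tiny
    return B + np.outer(r, r) / denom

def cauchy_step(g, B, delta):
    """Minimize the quadratic model along -g within the trust region of radius delta."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)
    curv = g @ B @ g
    t = delta / gnorm if curv <= 0 else min(gnorm ** 2 / curv, delta / gnorm)
    return -t * g

# Toy run: Rayleigh quotient minimization on the sphere S^{n-1}.
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
x = rng.standard_normal(n); x /= np.linalg.norm(x)
B, delta = np.eye(n), 1.0
for _ in range(100):
    g = grad(A, x)
    eta = cauchy_step(g, B, delta)
    x_new = retract(x, eta)
    pred = -(g @ eta + 0.5 * eta @ B @ eta)       # predicted decrease of the model
    rho = (f(A, x) - f(A, x_new)) / pred if pred > 0 else -1.0
    s = transport(x_new, eta)                     # step transported to the new tangent space
    y = grad(A, x_new) - transport(x_new, g)      # new gradient minus transported old gradient
    B = sr1(B, s, y)                              # SR1 update (performed even on rejected steps)
    if rho > 0.1:                                 # accept the trial point
        x = x_new
    delta = min(2 * delta, 10.0) if rho > 0.75 else (0.25 * delta if rho < 0.1 else delta)
print("final Rayleigh quotient:", f(A, x), " smallest eigenvalue:", np.linalg.eigvalsh(A).min())
```

The Cauchy step is used here only because it guarantees a model decrease in a few lines; a practical implementation would solve the trust-region subproblem more accurately and handle the tangent-space bookkeeping of the Hessian approximation with the vector transport, as the paper does.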
