Abstract

We propose a stochastic variance-reduced cubic regularized Newton algorithm for optimizing finite-sum problems over a Riemannian submanifold of Euclidean space. The proposed algorithm requires a full gradient and Hessian update at the beginning of each epoch, while it performs stochastic variance-reduced updates in the iterations within each epoch. An iteration complexity of $O(\epsilon^{-3/2})$ for obtaining an $(\epsilon, \sqrt{\epsilon})$-second-order stationary point, i.e., a point whose Riemannian gradient norm is upper bounded by $\epsilon$ and whose Riemannian Hessian has minimum eigenvalue lower bounded by $-\sqrt{\epsilon}$, is established when the manifold is embedded in Euclidean space. Furthermore, we propose a computationally more appealing modification of the algorithm that requires only an inexact solution of the cubic regularized Newton subproblem while retaining the same iteration complexity. The proposed algorithm is evaluated and compared with three other Riemannian second-order methods in two numerical studies: estimating the inverse scale matrix of a multivariate t-distribution on the manifold of symmetric positive definite matrices, and estimating the parameters of a linear classifier on the sphere manifold.
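
To make the epoch structure concrete, the following is a minimal sketch, not the paper's exact method: it uses the unit sphere embedded in $\mathbb{R}^n$ as the manifold, simple quadratic summands $f_i$, variance reduction carried out directly in the ambient space (the paper treats the change of tangent space more carefully), and a crude gradient-descent solver for the cubic subproblem, which loosely mirrors the inexact-subproblem variant. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 5, 100                        # dimension, number of summands
A = rng.standard_normal((N, n))      # data for f_i(x) = 0.5 * (a_i^T x)^2
sigma = 1.0                          # cubic regularization parameter (assumed fixed)

def egrad(i, x):                     # Euclidean gradient of f_i
    return A[i] * (A[i] @ x)

def ehess(i, x):                     # Euclidean Hessian of f_i
    return np.outer(A[i], A[i])

def proj(x, v):                      # projection onto the tangent space of the sphere at x
    return v - (x @ v) * x

def retract(x, s):                   # retraction from the tangent space back onto the sphere
    y = x + s
    return y / np.linalg.norm(y)

def solve_cubic(g, H, sigma, iters=200, lr=0.01):
    # Inexact solve of min_s <g, s> + 0.5 s^T H s + (sigma/3) ||s||^3
    # by plain gradient descent on the cubic model -- a crude stand-in
    # for the paper's subproblem solver.
    s = np.zeros_like(g)
    for _ in range(iters):
        grad_model = g + H @ s + sigma * np.linalg.norm(s) * s
        s -= lr * grad_model
    return s

x = rng.standard_normal(n)
x /= np.linalg.norm(x)               # initial point on the sphere
for epoch in range(5):
    x0 = x.copy()
    # Full gradient and Hessian snapshot at the beginning of the epoch.
    G0 = np.mean([egrad(i, x0) for i in range(N)], axis=0)
    H0 = np.mean([ehess(i, x0) for i in range(N)], axis=0)
    for t in range(10):
        i = rng.integers(N)
        # Stochastic variance-reduced estimates within the epoch
        # (projection in R^n stands in for the transport between tangent spaces).
        g = proj(x, egrad(i, x) - egrad(i, x0) + G0)
        H = ehess(i, x) - ehess(i, x0) + H0
        s = solve_cubic(g, H, sigma)
        x = retract(x, proj(x, s))
    G = np.mean([egrad(i, x) for i in range(N)], axis=0)
    print(f"epoch {epoch}: Riemannian grad norm = {np.linalg.norm(proj(x, G)):.4f}")
```

The printed Riemannian gradient norm should decrease across epochs; the per-epoch full snapshot is what keeps the within-epoch stochastic estimates low-variance.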
