Abstract

Direct search methods represent a robust and reliable class of algorithms for solving black-box optimization problems. In this paper, these strategies are extended to Riemannian optimization, where minimization is performed with respect to variables constrained to lie on a manifold. More specifically, classic and linesearch extrapolated variants of direct search are considered, and tailored strategies are devised for the minimization of both smooth and nonsmooth functions by making use of retractions. A class of direct search algorithms for minimizing nonsmooth objectives on a Riemannian manifold without access to (sub)derivatives is analyzed for the first time in the literature. Along with convergence guarantees, numerical results on a standard set of test problems are provided.
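To make the setting concrete, the sketch below illustrates (under assumptions of our own, not as the paper's algorithm) a basic direct-search loop on the unit sphere: candidate points are generated by retracting scaled tangent directions, a sufficient-decrease test with forcing function gamma*alpha^2 decides acceptance, and the step size alpha shrinks on unsuccessful polls. The retraction, polling set, and all parameter values are illustrative choices only.

```python
# Minimal sketch of derivative-free direct search on the unit sphere S^{n-1},
# using the retraction R_x(v) = (x + v) / ||x + v||. This is an illustrative
# toy, not the algorithm analyzed in the paper.
import numpy as np


def retract_sphere(x, v):
    """Retraction on the sphere: move in the tangent direction, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)


def tangent_basis(x):
    """Orthonormal basis of the tangent space at x (all directions orthogonal to x)."""
    q, _ = np.linalg.qr(np.column_stack([x, np.eye(x.size)]))
    return q[:, 1:].T  # rows span the orthogonal complement of x


def direct_search_sphere(f, x0, alpha0=1.0, theta=0.5, gamma=1e-4,
                         max_iter=200, alpha_min=1e-8):
    """Poll +/- tangent basis directions; shrink alpha when no sufficient decrease."""
    x = x0 / np.linalg.norm(x0)
    fx, alpha = f(x), alpha0
    for _ in range(max_iter):
        if alpha < alpha_min:
            break
        B = tangent_basis(x)
        improved = False
        for d in np.vstack([B, -B]):
            y = retract_sphere(x, alpha * d)
            fy = f(y)
            if fy < fx - gamma * alpha**2:  # sufficient decrease test
                x, fx, improved = y, fy, True
                break
        if not improved:
            alpha *= theta  # unsuccessful poll: reduce the step size
    return x, fx


# Usage: minimize the Rayleigh quotient x^T A x on the sphere; the optimum
# approaches the smallest eigenvalue of A.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    A = A + A.T
    x_star, f_star = direct_search_sphere(lambda x: x @ A @ x,
                                          rng.standard_normal(5))
    print(f_star, np.linalg.eigvalsh(A)[0])
```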
