Abstract
We extend coordinate descent to manifold domains and provide convergence analyses for geodesically convex and nonconvex smooth objective functions. Our key insight is to draw an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold; accordingly, we call our method tangent subspace descent (TSD). The core principle behind ensuring convergence of TSD is the appropriate choice of subspace at each iteration. To this end, we propose two novel conditions, the (C, r)-norm condition and the C-randomized norm condition, on deterministic and randomized modes of subspace selection, respectively, that guarantee convergence for smooth functions and that are satisfied in practical contexts. We then give two concrete subspace selection rules, one deterministic and one randomized, that are of particular practical interest on the Stiefel manifold. Proof-of-concept numerical experiments on the sparse principal component analysis problem demonstrate TSD's efficacy.

Funding: This work was supported by the National Science Foundation [Grant 1740707] and the Defense Advanced Research Projects Agency Lagrange Program [Grant N660011824020].
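To make the iteration pattern concrete, the following is a minimal Python/NumPy sketch of a TSD-style loop on the unit sphere, a simpler manifold than the Stiefel manifold treated in the paper. The coordinate-block restriction of the Riemannian gradient, the fixed step size, and the normalization retraction are illustrative assumptions chosen for brevity; they are not the paper's (C, r)-norm or C-randomized norm selection rules, and the function and parameter names are hypothetical.

```python
import numpy as np

def tsd_sphere(grad_f, x0, step=0.05, n_iters=500, block_size=2, rng=None):
    """Illustrative TSD-style iteration on the unit sphere S^{n-1}:
    at each step, restrict the Riemannian gradient to a randomly chosen
    coordinate block of the tangent space, take a gradient step there,
    and retract back to the sphere by normalization."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0 / np.linalg.norm(x0)
    for _ in range(n_iters):
        g = grad_f(x)                      # Euclidean gradient of f at x
        rg = g - (x @ g) * x               # project onto tangent space T_x S^{n-1}
        idx = rng.choice(x.size, size=block_size, replace=False)
        v = np.zeros_like(x)
        v[idx] = rg[idx]                   # restrict to a coordinate block
        v -= (x @ v) * x                   # re-project so v stays tangent
        x = x - step * v                   # step in the chosen tangent subspace
        x /= np.linalg.norm(x)             # retraction: normalize back to sphere
    return x
```

As a toy usage example, minimizing the quadratic form x^T A x over the sphere should drive the iterate toward an eigenvector of the smallest eigenvalue of A:

```python
A = np.diag([5.0, 3.0, 1.0, 0.5])
x = tsd_sphere(lambda x: 2 * A @ x, np.ones(4), n_iters=2000)
# x is expected to approach (0, 0, 0, +/-1) for this diagonal A
```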