Abstract
We propose a novel computational strategy, named the adaptive localized Cayley parametrization technique, for accelerating optimization over the Stiefel manifold. The proposed optimization algorithm is designed as a gradient-descent-type scheme for the composite of the original cost function and the inverse of the localized Cayley transform defined on the vector space of all skew-symmetric matrices. Thanks to the adaptive localized Cayley transform, which is a computable diffeomorphism between the orthogonal group and the vector space of skew-symmetric matrices, the proposed algorithm (i) is free from the singularity issue, which can cause the performance degradation observed in the dual Cayley parametrization technique [Yamada-Ezaki'03], and (ii) can exploit powerful acceleration techniques available on a vector space without suffering from the nonlinear nature of the Stiefel manifold. We also present a convergence analysis for the prototype algorithm employing the Armijo rule, showing that the gradient of the composite function at zero in the range space of the localized Cayley transform is guaranteed to converge to zero. Numerical experiments show excellent performance compared with major optimization algorithms designed essentially with retractions on the tangent space of the Stiefel manifold [Absil-Mahony-Sepulchre'08, Wen-Yin'13].
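To illustrate the core idea, here is a minimal sketch of the basic (non-adaptive, non-localized) Cayley parametrization: a skew-symmetric matrix A is mapped to an orthogonal matrix by (I - A)^{-1}(I + A), its leading p columns give a point on the Stiefel manifold, and ordinary gradient descent with Armijo backtracking runs on the flat vector of skew parameters. The toy cost f(U) = -trace(U^T C U), the use of numerical gradients, and all problem sizes are assumptions for illustration only; this is not the paper's adaptive localized scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5, 2
C = rng.standard_normal((n, n))
C = C + C.T  # symmetric matrix for a toy Rayleigh-quotient-type cost (assumption)

def cayley(A):
    """Cayley transform: skew-symmetric A -> orthogonal (I - A)^{-1}(I + A)."""
    I = np.eye(A.shape[0])
    return np.linalg.solve(I - A, I + A)

def unpack(v):
    """Build a skew-symmetric matrix from its strictly lower-triangular entries."""
    A = np.zeros((n, n))
    A[np.tril_indices(n, -1)] = v
    return A - A.T

def cost(v):
    """Composite cost: f(Cayley(A)) with f(U) = -trace(U^T C U), U = first p columns."""
    U = cayley(unpack(v))[:, :p]
    return -np.trace(U.T @ C @ U)

def num_grad(v, h=1e-6):
    """Central-difference gradient on the flat parameter space (sketch only)."""
    g = np.zeros_like(v)
    for i in range(v.size):
        e = np.zeros_like(v)
        e[i] = h
        g[i] = (cost(v + e) - cost(v - e)) / (2 * h)
    return g

# Gradient descent with Armijo backtracking, entirely on the vector space,
# so no retraction onto the manifold is ever needed.
v = np.zeros(n * (n - 1) // 2)
for _ in range(200):
    g = num_grad(v)
    t, f0 = 1.0, cost(v)
    while cost(v - t * g) > f0 - 1e-4 * t * (g @ g):
        t *= 0.5
    v = v - t * g

U = cayley(unpack(v))[:, :p]
print(np.linalg.norm(U.T @ U - np.eye(p)))  # feasibility: U^T U should equal I_p
```

Since A = 0 maps to the identity, iterates stay exactly feasible by construction, which is the property that lets standard vector-space acceleration schemes be applied directly.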