Abstract
This article is concerned with the problem of minimizing a smooth function over the Stiefel manifold. To address this problem, we introduce two adaptive scaled gradient projection methods that incorporate scaling matrices depending on the step size and on a parameter that controls the search direction. These iterative algorithms use a projection operator based on the QR factorization to preserve feasibility at each iteration. However, in some particular cases, the proposed methods do not require any projection operator. In addition, we consider a Barzilai–Borwein-like step size combined with the Zhang–Hager nonmonotone line-search technique to accelerate the convergence of the proposed procedures. We prove global convergence for these schemes, and we evaluate their effectiveness and efficiency through an extensive computational study, comparing our approaches with other state-of-the-art gradient-type algorithms.
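The following is a minimal sketch, not the authors' adaptive scaled methods: it illustrates only the basic ingredients named in the abstract, namely a gradient step projected back onto the Stiefel manifold via a QR factorization and a Barzilai–Borwein-like step size (the scaling matrices and the Zhang–Hager nonmonotone line search are omitted). The helper names, the test objective f(X) = -trace(X^T A X), and all parameter values are illustrative assumptions.

```python
# Sketch (assumptions, not the paper's algorithm): projected gradient iteration on
# the Stiefel manifold St(n, p) = {X in R^{n x p} : X^T X = I_p}, with a QR-based
# projection and a Barzilai-Borwein-like step size.
import numpy as np

def qr_projection(Y):
    """Map a full-rank n x p matrix onto the Stiefel manifold via the QR factorization."""
    Q, R = np.linalg.qr(Y)
    # Fix column signs so the Q factor is uniquely determined.
    signs = np.sign(np.sign(np.diag(R)) + 0.5)
    return Q * signs

def bb_stepsize(X, X_prev, G, G_prev, lo=1e-10, hi=1e10):
    """Barzilai-Borwein-like step size from successive iterates and gradients."""
    S, Y = X - X_prev, G - G_prev
    denom = abs(np.sum(S * Y))
    tau = np.sum(S * S) / denom if denom > 0 else 1e-3
    return min(max(tau, lo), hi)

# Toy problem: minimize f(X) = -trace(X^T A X) over St(n, p) for symmetric A,
# whose minimizers span the leading eigenspace of A.
rng = np.random.default_rng(0)
n, p = 50, 5
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
grad = lambda X: -2 * A @ X                      # Euclidean gradient of f

X = qr_projection(rng.standard_normal((n, p)))   # feasible starting point
G = grad(X)
tau = 1e-3                                       # initial step size (assumed)
for k in range(200):
    X_new = qr_projection(X - tau * G)           # gradient step + QR projection
    G_new = grad(X_new)
    tau = bb_stepsize(X_new, X, G_new, G)        # BB-like update for next step
    X, G = X_new, G_new

print("feasibility error:", np.linalg.norm(X.T @ X - np.eye(p)))
```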