Abstract

The method of steepest descent for solving unconstrained minimization problems is well understood. It is known, for instance, that when applied to a smooth objective function f, and converging to a solution point x where the corresponding Hessian matrix F(x) is positive definite, the asymptotic rate of convergence is given by the Kantorovich ratio (β − α)²/(β + α)², where α and β are respectively the smallest and largest eigenvalues of the Hessian matrix F(x). This result is one of the major sharp results on convergence of minimization algorithms. In this paper a corresponding result is given for the gradient projection method for solving constrained minimization problems. It is shown that the asymptotic rate of convergence of gradient projection methods is also given by a Kantorovich ratio, but with α and β determined by the Lagrangian associated with the problem. Specifically, if L is the Hessian of the Lagrangian evaluated at the solution, α and β are the smallest and largest eigenvalues of L restricted to the subspace tangent to the constraint surface. This result is a natural extension of the one for unconstrained problems. Unlike the unconstrained situation, where linear analysis is natural, the constrained situation is inherently nonlinear since the analysis must be confined to the constraint surface. This technical difficulty would obscure the basic simplicity of the analysis if it were not for the introduction of the concept of geodesic descent, which restores order to an otherwise potentially chaotic and unexciting analysis.
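
As a purely illustrative sketch (not part of the paper), the unconstrained version of this rate can be checked numerically: for steepest descent with exact line search on a quadratic f(x) = ½ xᵀQx, the objective error shrinks at each step by at most the Kantorovich ratio, and for generic starting points the observed ratio approaches it. The matrix Q, the starting point, and the iteration count below are arbitrary choices made only for illustration.

# Illustrative sketch (not from the paper): steepest descent with exact line
# search on f(x) = 0.5 x'Qx, comparing the per-step error reduction with the
# Kantorovich ratio ((beta - alpha)/(beta + alpha))**2, where alpha and beta
# are the smallest and largest eigenvalues of Q.  Q and x0 are arbitrary.
import numpy as np

Q = np.diag([1.0, 3.0, 10.0])          # Hessian; alpha = 1, beta = 10
alpha, beta = 1.0, 10.0
x = np.array([1.0, 1.0, 1.0])          # arbitrary starting point

ratios = []
for _ in range(30):
    g = Q @ x                           # gradient of 0.5 x'Qx
    step = (g @ g) / (g @ Q @ g)        # exact line-search step length
    f_old = 0.5 * x @ Q @ x
    x = x - step * g
    f_new = 0.5 * x @ Q @ x
    ratios.append(f_new / f_old)        # f* = 0 at the minimizer x = 0

kantorovich = ((beta - alpha) / (beta + alpha)) ** 2
print("observed per-step ratio:", ratios[-1])   # approaches the bound
print("Kantorovich ratio:      ", kantorovich)  # (9/11)^2 ~ 0.669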
