Abstract

The projected gradient methods treated here generate iterates by the rule $x_{k+1} = P_\Omega(x_k - s_k \nabla F(x_k))$, $x_1 \in \Omega$, where $\Omega$ is a closed convex set in a real Hilbert space $X$, $s_k$ is a positive real number determined by a Goldstein-Bertsekas condition, $P_\Omega$ projects $X$ into $\Omega$, $F$ is a differentiable function whose minimum is sought in $\Omega$, and $\nabla F$ is locally Lipschitz continuous. Asymptotic stability and convergence rate theorems are proved for singular local minimizers $\xi$ in the interior of $\Omega$, or more generally, in some open facet in $\Omega$. The stability theorem requires that: (i) $\xi$ is a proper local minimizer and $F$ grows uniformly in $\Omega$ near $\xi$; (ii) $-\nabla F(\xi)$ lies in the relative interior of the cone $K_\xi$ of outer normals to $\Omega$ at $\xi$; and (iii) $\xi$ is an isolated critical point and the defect $\|P_\Omega(x - \nabla F(x)) - x\|$ grows uniformly within the facet containing $\xi$. The convergence rate theorem imposes (i) and (ii), and also requires that: (iv) $F$ is $C^4$ near $\xi$ and grows no slower than $\|x - \xi\|^4$ within the facet; and (v) the projected Hessian operator $P_{F_\xi} \nabla^2 F(\xi) P_{F_\xi}$ is positive definite on its range in the subspace $F_\xi$ orthogonal to $K_\xi$. Under these conditions, $\{x_k\}$ converges to $\xi$ from nearby starting points $x_1$, with $F(x_k) - F(\xi) = O(k^{-2})$ and $\|x_k - \xi\| = O(k^{-1/2})$. No explicit or implied local pseudoconvexity or level set compactness demands are imposed on $F$ in this analysis. Furthermore, condition (v) and the uniform growth stipulations in (i) and (iii) are redundant in $\mathbb{R}^n$.
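To make the iteration rule concrete, the following is a minimal sketch of a projected gradient loop in which the step size is chosen by a backtracking sufficient-decrease test; the Goldstein-Bertsekas condition analyzed in the paper is of this general flavor, but its exact form differs, and the function names, tolerances, and the box constraint used for $\Omega$ below are illustrative assumptions, not the paper's setting.

```python
import numpy as np

def projected_gradient(F, gradF, project, x1, s0=1.0, alpha=0.5, beta=0.5,
                       max_iter=1000, tol=1e-10):
    """Sketch of x_{k+1} = P_Omega(x_k - s_k * grad F(x_k)).

    The step size s_k is found by backtracking until an Armijo-type
    sufficient-decrease test holds (a stand-in for the Goldstein-Bertsekas
    rule; the paper's precise condition is not reproduced here).
    """
    x = np.asarray(x1, dtype=float)
    for _ in range(max_iter):
        g = gradF(x)
        s = s0
        while True:
            x_new = project(x - s * g)
            # Sufficient decrease measured along the projected step x_new - x.
            if F(x_new) <= F(x) + alpha * g.dot(x_new - x) or s < 1e-16:
                break
            s *= beta
        # Stop when the defect ||P_Omega(x - grad F(x)) - x|| is small,
        # i.e. x is (nearly) a fixed point of the projected gradient map.
        if np.linalg.norm(project(x - g) - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical example: minimize F(x) = ||x - c||^2 over the box Omega = [0, 1]^3.
c = np.array([1.5, -0.25, 0.4])
F = lambda x: np.sum((x - c) ** 2)
gradF = lambda x: 2.0 * (x - c)
project = lambda x: np.clip(x, 0.0, 1.0)   # P_Omega for a box is coordinatewise clipping
x_star = projected_gradient(F, gradF, project, x1=np.zeros(3))
print(x_star)   # approaches [1.0, 0.0, 0.4]
```

In this sketch the constraint is only enforced through the projection, and the stopping test uses the same defect quantity that appears in condition (iii) of the abstract.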
