Abstract

Conditional gradient algorithms with implicit line minimization and Goldstein–Armijo step length rules are considered for the problem $\min _\Omega F$, with $\Omega $ a bounded convex subset of a real Banach space. When the Fréchet derivative $F'$ is uniformly continuous on $\Omega $, the iterates $x_n $ generated by any of the algorithms form an “extremizing” sequence, in the sense that the quantity $\langle {F'(x_n ),x_n } \rangle - \inf _{y \in \Omega } \langle {F'(x_n ),y} \rangle $ converges to zero as $n \to \infty $. This ensures that every limit point of $\{ x_n \} $ is an extremal, and for compact $\Omega $ it then follows that $\{ x_n \} $ converges to the set of extremals in $\Omega $. Weak counterparts of these results are also established. Convergence rate estimates are derived for convex $F$ and Lipschitz continuous $F'$. These estimates are closely related to results obtained in an earlier investigation of two explicit step length formulas for conditional gradient methods. Once again, the growth rate of the function $a(\sigma ) = \inf \{ \langle {F'(\xi ),x - \xi } \rangle \mid x \in \Omega ,\| {x - \xi } \| \geqq \sigma \} $ at an extremal $\xi $ determines how rapidly the functional values $F(x_n )$ converge to $\inf _\Omega F$.
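The general scheme described above can be illustrated with a minimal finite-dimensional sketch: a conditional gradient (Frank–Wolfe) iteration with an Armijo backtracking step length rule, applied to a smooth convex $F$ over the probability simplex. This example is not from the paper; the function names, the test problem, and the parameter values (`beta`, `delta`) are illustrative assumptions, and the computed quantity `gap` corresponds to the "extremizing" quantity $\langle F'(x_n), x_n - y_n\rangle$ in the abstract.

```python
# Illustrative sketch (not the paper's algorithm verbatim): conditional
# gradient / Frank-Wolfe with an Armijo step length rule.  All names and
# parameter values here are assumptions for the example.
import numpy as np

def frank_wolfe_armijo(F, grad_F, lmo, x0, max_iter=2000,
                       beta=0.5, delta=1e-4, tol=1e-10):
    x = x0.copy()
    for _ in range(max_iter):
        g = grad_F(x)
        y = lmo(g)            # y minimizes <g, v> over the feasible set Omega
        d = y - x             # feasible descent direction
        gap = -g @ d          # "extremizing" quantity <g, x> - min_v <g, v>
        if gap <= tol:
            break
        # Armijo rule: largest t = beta^k in (0, 1] with sufficient decrease
        t = 1.0
        while F(x + t * d) > F(x) - delta * t * gap:
            t *= beta
        x = x + t * d         # convex combination, so x stays in Omega
    return x

# Toy problem: minimize F(x) = 0.5 * ||x - c||^2 over the probability
# simplex, whose linear minimization oracle returns the vertex at the
# smallest gradient coordinate.  Here c lies in the simplex, so x -> c.
c = np.array([0.2, 0.5, 0.3])
F = lambda x: 0.5 * np.sum((x - c) ** 2)
grad_F = lambda x: x - c

def lmo(g):
    e = np.zeros_like(g)
    e[np.argmin(g)] = 1.0
    return e

x = frank_wolfe_armijo(F, grad_F, lmo, np.array([1.0, 0.0, 0.0]))
```

For this convex quadratic (Lipschitz continuous gradient), the classical $O(1/n)$ rate applies, consistent with the rate estimates discussed in the abstract; the duality gap `gap` upper-bounds $F(x_n) - \inf_\Omega F$ at each iterate.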
