Abstract
This paper is concerned with the problem $\min\{ f(x) \mid x \in X \}$, where X is a convex subset of a linear space H and f is a smooth real-valued function on H. We propose the class of methods $x_{k+1} = P(x_k - \alpha_k g_k)$, where P denotes projection on X with respect to a Hilbert space norm $\|\cdot\|$, $g_k$ denotes the Fréchet derivative of f at $x_k$ with respect to another Hilbert space norm $\|\cdot\|_k$ on H, and $\alpha_k$ is a positive scalar stepsize. We thus remove an important restriction in the original proposals of Goldstein [1] and Levitin and Poljak [2], where the norms $\|\cdot\|$ and $\|\cdot\|_k$ must be the same. It is therefore possible to match the norm $\|\cdot\|$ to the structure of X so that the projection operation is simplified, while at the same time reserving the option to choose $\|\cdot\|_k$ on the basis of approximations to the Hessian of f so as to attain a typically superlinear rate of convergence. The resulting methods are particularly attractive for large-scale problems with specially structured constraint sets, such as optimal control and nonlinear multicommodity network flow problems. The latter class of problems is discussed in some detail.
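To make the iteration concrete, here is a minimal sketch in Python, not taken from the paper: the names `two_metric_projection`, `hess_inv`, and `project`, the fixed unit stepsize, and the box-constrained quadratic instance are all illustrative assumptions. It shows the two-metric idea on a set X whose Euclidean projection is trivial (componentwise clipping), while the gradient $g_k$ is taken in a Newton-scaled norm $\|\cdot\|_k$; the paper's actual stepsize rules and scaling choices are more elaborate.

```python
import numpy as np

def two_metric_projection(grad, hess_inv, project, x0,
                          alpha=1.0, tol=1e-8, max_iter=100):
    """Sketch of the iteration x_{k+1} = P(x_k - alpha_k g_k)."""
    x = x0
    for _ in range(max_iter):
        g = hess_inv(x) @ grad(x)       # g_k: derivative of f w.r.t. ||.||_k
        x_new = project(x - alpha * g)  # P: projection on X w.r.t. ||.||
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Box constraints X = {x : l <= x <= u}; Euclidean projection on X is
# componentwise clipping, which is what motivates keeping ||.|| simple
# while ||.||_k carries the Hessian information.
l, u = np.zeros(2), np.ones(2)
Q = np.array([[3.0, 1.0], [1.0, 2.0]])    # Hessian of a convex quadratic
b = np.array([-4.0, -3.0])
x_star = two_metric_projection(
    grad=lambda x: Q @ x + b,
    hess_inv=lambda x: np.linalg.inv(Q),  # Newton scaling defines ||.||_k
    project=lambda x: np.clip(x, l, u),
    x0=np.array([0.5, 0.5]),
)
print(x_star)  # approx [1., 1.] for this instance
```

With the Newton scaling above, the unconstrained step lands at $-Q^{-1}b = (1,1)$ and the clipping projection returns it to the box in a single iteration, illustrating the fast convergence the Hessian-based choice of $\|\cdot\|_k$ is meant to deliver.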