Abstract

Acceleration schemes can dramatically improve existing optimization procedures. In most of the work on these schemes, such as nonlinear generalized minimal residual (N-GMRES), acceleration is based on minimizing the ℓ2 norm of some target on subspaces of ℝⁿ. There are many numerical examples that show how accelerating general-purpose and domain-specific optimizers with N-GMRES results in large improvements. We propose a natural modification to N-GMRES, which significantly improves the performance in a testing environment originally used to advocate N-GMRES. Our proposed approach, which we refer to as O-ACCEL (objective acceleration), is novel in that it minimizes an approximation to the objective function on subspaces of ℝⁿ. We prove that O-ACCEL reduces to the full orthogonalization method for linear systems when the objective is quadratic, which differentiates our proposed approach from existing acceleration methods. Comparisons with the limited-memory Broyden–Fletcher–Goldfarb–Shanno and nonlinear conjugate gradient methods indicate the competitiveness of O-ACCEL. As it can be combined with domain-specific optimizers, it may also be beneficial in areas where limited-memory Broyden–Fletcher–Goldfarb–Shanno and nonlinear conjugate gradient methods are not suitable.

Highlights

  • Gradient-based optimization algorithms normally iterate based on tractable approximations to the objective function at a particular point

  • We propose an acceleration scheme that can be used on top of existing optimization algorithms; it builds a subspace from previous iterates and aims to minimize the objective function over that subspace

  • The acceleration step consists of solving a small linear system that arises from a linearization of the gradient
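The acceleration step described in the highlights can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' exact O-ACCEL formulation: the function name, matrix conventions, and use of a least-squares solve are choices made here. Given previous iterates and their gradients, the local objective model is minimized over the span of iterate differences, with the gradient linearized via gradient differences, which reduces to a small linear system.

```python
import numpy as np

def o_accel_step(X, G):
    """Schematic objective-acceleration step (illustrative, not the paper's code).

    X : (n, m+1) array whose columns are previous iterates x_0, ..., x_m,
        with x_m the most recent.
    G : (n, m+1) array of the corresponding gradients g_i = grad f(x_i).

    Returns an accelerated point x_m + D @ alpha, where the coefficients
    alpha solve a small m-by-m system obtained by linearizing the gradient
    over the subspace spanned by the iterate differences.
    """
    xk, gk = X[:, -1], G[:, -1]
    D = X[:, :-1] - xk[:, None]   # search directions d_i = x_i - x_m
    W = G[:, :-1] - gk[:, None]   # gradient differences: linearized grad f
    # Stationarity of the local model along the columns of D:
    #   D^T (gk + W alpha) = 0  =>  (D^T W) alpha = -D^T gk
    M = D.T @ W
    rhs = -D.T @ gk
    alpha, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return xk + D @ alpha
```

For a quadratic objective f(x) = ½ xᵀAx − bᵀx, the gradient is affine, so W = A D exactly and the step minimizes f over the affine subspace x_m + span(D); this is the Galerkin condition underlying the paper's equivalence with the full orthogonalization method.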


Keywords

acceleration, full orthogonalization method, nonlinear GMRES, optimization

INTRODUCTION
OPTIMIZATION ACCELERATION WITH O-ACCEL
Algorithm
O-ACCEL as a FOM
NUMERICAL EXPERIMENTS
Test problems from De Sterck
Experiment design
Performance profiles
The tensor optimization problem from De Sterck
CUTEst test problems
Findings
CONCLUSION

