Abstract

Global dynamic optimization problems are often formulated as nonlinear optimization problems with embedded parametric ordinary differential equations (ODEs). Deterministic methods for global optimization typically employ subgradients of convex relaxations to construct the crucial lower bounds. Our recent work shows that subgradients in dynamic optimization problems may be obtained by adapting standard forward or adjoint sensitivity approaches; the adjoint approach is expected to be computationally favorable except for small problems. However, established adjoint implementations are incompatible with established libraries for subgradient evaluation. At FOCAPO/CPC 2023, we outlined an automated proof-of-concept implementation of adjoint subgradient evaluation in C++, adapting the convexification package MC++, the ODE solver CVODES, and our own differentiation and code generation tools. This article details the nontrivial adaptations of these tools that ultimately implement adjoint ODE sensitivities combined with either the forward or reverse mode of subgradient automatic differentiation (AD). Numerical examples are presented for illustration.
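To give a sense of the subgradient machinery the abstract refers to, the following is a minimal, self-contained sketch of forward-mode subgradient propagation for a single McCormick convex relaxation of a bilinear term. It is a toy illustration only, not the MC++ or CVODES interface; the `SubVal` struct and `bilinear_cv` function are hypothetical names introduced here for exposition.

```cpp
#include <array>

// Hypothetical container (not the MC++ API): the value of a convex
// underestimator at a point, paired with a subgradient with respect to
// the two independent variables (x, y), propagated in forward mode.
struct SubVal {
    double cv;                  // convex relaxation value at the current point
    std::array<double, 2> sub;  // subgradient w.r.t. (x, y)
};

// McCormick convex relaxation of the bilinear term w = x*y on the box
// [xL, xU] x [yL, yU]: the pointwise maximum of two affine underestimators.
// The gradient of the active affine cut is a valid subgradient of the max.
SubVal bilinear_cv(double x, double y,
                   double xL, double xU, double yL, double yU) {
    double c1 = yL * x + xL * y - xL * yL;  // first affine underestimator
    double c2 = yU * x + xU * y - xU * yU;  // second affine underestimator
    if (c1 >= c2) return {c1, {yL, xL}};
    return {c2, {yU, xU}};
}
```

For example, at (x, y) = (1, 2) on [0, 2] x [0, 3], the second cut is active, giving the relaxation value 1 (a valid underestimate of xy = 2) with subgradient (3, 2). In the paper's setting, such value/subgradient pairs must additionally be propagated through the solution of a parametric ODE, which is where the adapted forward or adjoint sensitivity machinery enters.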
