Abstract

In design optimization and parameter identification, the objective, or response, functions are typically linked to the truly independent variables through equality constraints, which we will refer to as state equations. Our key assumption is that it is impossible to form and factor the corresponding constraint Jacobian, but one has instead some fixed-point algorithm for computing a feasible state, given any reasonable value of the independent variables. Assuming that this iteration is eventually contractive, we will show how reduced gradients (Jacobians) and Hessians (in other words, the total derivatives) of the responses with respect to the independent variables can be obtained via algorithmic, or automatic, differentiation (AD). In our approach the actual application of the so-called reverse, or adjoint, differentiation mode is kept local to each iteration step. Consequently, the memory requirement is typically not unduly enlarged. The resulting approximate Lagrange multipliers are used to compute estimates of the reduced function values that can be shown to converge twice as fast as the underlying state space iteration. By a combination with the forward mode of AD, one can also obtain extra-accurate directional derivatives of the reduced functions, as well as feasible state space directions and the corresponding reduced or projected Hessians of the Lagrangian. Our approach is verified by test calculations on an aircraft wing with two responses, namely the lift and drag coefficients, and two variables, namely the angle of attack and the Mach number. The state is a two-dimensional flow field defined as the solution of the discretized Euler equations under transonic conditions.
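The doubled convergence rate of the multiplier-corrected function values can be illustrated on a deliberately simple scalar model problem; the map `G`, response `f`, and their hand-coded derivatives below are illustrative stand-ins (not the paper's CFD setting, where the derivatives would come from AD). For a contractive state iteration y ← G(y, x), one adjoint step λ ← G_y·λ + f_y is taken per state step, and the corrected estimate f(y) + λ·(G(y, x) − y) is the Lagrangian evaluated at the current iterates:

```python
def G(y, x):
    """Contractive fixed-point map for the state; contraction factor 0.5."""
    return 0.5 * y + x

def dG_dy(y, x):
    return 0.5

def f(y):
    """Response function of the state."""
    return y * y

def df_dy(y):
    return 2.0 * y

x = 1.0
y_star = 2.0 * x        # exact fixed point of G for this model problem
f_star = f(y_star)      # exact reduced response value

y, lam = 0.0, 0.0
for k in range(20):
    # one adjoint step per state step: reverse mode kept local to the iteration
    lam = dG_dy(y, x) * lam + df_dy(y)
    y = G(y, x)

plain = f(y)                                 # converges like rho**k
corrected = f(y) + lam * (G(y, x) - y)       # converges roughly like rho**(2k)

err_plain = abs(plain - f_star)
err_corrected = abs(corrected - f_star)
```

After 20 iterations the plain response value still carries an error on the order of the state error, while the corrected value is several orders of magnitude more accurate, consistent with the doubled convergence rate claimed above.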
