A modified robust design optimization approach is presented, which uses the first-order second-moment method to compute the mean value and the standard deviation of arbitrary objective functions. Existing approaches compute the gradient of the variance using the adjoint method, direct differentiation, or finite differences. These approaches either require access to the FE code or incur high computational cost, or both. In this paper, a new approach for computing the gradient of the variance is provided. It can easily be implemented as a non-intrusive method that behaves similarly to finite differences, at the cost of only one additional objective evaluation, independent of the number of variables. A step size must be chosen carefully here, and therefore a procedure to determine a problem-independent step size is provided. Alternatively, the approach can be implemented as an analytic method with the same cost as the adjoint method, but with wider applicability (e.g., eigenvalue problems). The proposed approach is derived, analyzed, and applied to several benchmark examples.
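As background for the quantities discussed above, the classical first-order second-moment (FOSM) approximation of the mean and standard deviation of an objective can be sketched as follows. This is a minimal illustration, not the paper's method: the gradient is obtained here by per-variable central finite differences (the conventional costly route), whereas the paper's contribution is a way to obtain the gradient of the variance at the cost of a single additional objective evaluation. The function and variable names (`fosm`, `f`, `mu`, `cov`) are illustrative assumptions.

```python
import numpy as np

def fosm(f, mu, cov, h=1e-6):
    """First-order second-moment estimate of mean and standard deviation
    of f(X) for random input X with mean vector `mu` and covariance `cov`.

    Linearizing f about mu gives:
        E[f]   ~ f(mu)
        Var[f] ~ grad f(mu)^T  cov  grad f(mu)

    The gradient is computed by central finite differences, which costs
    2n extra objective evaluations for n variables (illustrative only).
    """
    mu = np.asarray(mu, dtype=float)
    n = mu.size
    grad = np.zeros(n)
    for i in range(n):
        step = np.zeros(n)
        step[i] = h
        grad[i] = (f(mu + step) - f(mu - step)) / (2.0 * h)
    mean = f(mu)
    variance = grad @ np.asarray(cov, dtype=float) @ grad
    return mean, np.sqrt(variance)
```

For a linear objective the approximation is exact; e.g. with `f(x) = 3*x[0] + 2*x[1]` and independent inputs with variances 1 and 4, the FOSM variance is `9*1 + 4*4 = 25`, i.e. a standard deviation of 5.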