Abstract

A modified robust design optimization approach is presented that uses the first-order second-moment (FOSM) method to compute the mean value and the standard deviation of arbitrary objective functions. Existing approaches compute the gradient of the variance using the adjoint method, direct differentiation, or finite differences. These approaches either require access to the FE code or incur high computational cost. In this paper, a new approach for computing the gradient of the variance is presented. It can easily be implemented as a non-intrusive method that behaves similarly to finite differences, at the cost of only one additional objective evaluation, independent of the number of variables. Because a step size must be chosen carefully in this setting, a procedure for determining a problem-independent step size is also provided. Alternatively, the approach can be implemented as an analytic method with the same cost as the adjoint method but wider applicability (e.g., to eigenvalue problems). The proposed approach is derived, analyzed, and applied to several benchmark examples.
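To illustrate the first-order second-moment idea the abstract builds on, the following is a minimal sketch (not the paper's method): for uncorrelated inputs with means mu and standard deviations sigma, FOSM approximates the mean of f by f(mu) and the variance by the sum of squared sensitivity-weighted input deviations. The objective, the forward-difference gradient, and the step size h are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fosm_moments(f, mu, sigma, h=1e-6):
    """First-order second-moment (FOSM) estimate of the mean and
    standard deviation of f(x) for uncorrelated random inputs with
    means mu and standard deviations sigma.

    The gradient is approximated here with forward finite differences;
    h is a (problem-dependent) step size chosen only for illustration.
    """
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    f0 = f(mu)
    # Forward finite-difference gradient of f at the mean point:
    # one extra objective evaluation per input variable.
    grad = np.empty_like(mu)
    for i in range(mu.size):
        x = mu.copy()
        x[i] += h
        grad[i] = (f(x) - f0) / h
    mean = f0                            # first-order mean estimate
    var = np.sum((grad * sigma) ** 2)    # first-order variance estimate
    return mean, np.sqrt(var)

# Hypothetical objective f(x) = x1^2 + 2*x2 with input means [1, 2]
# and input standard deviations [0.1, 0.2].
mean, std = fosm_moments(lambda x: x[0] ** 2 + 2 * x[1],
                         [1.0, 2.0], [0.1, 0.2])
```

Note that this plain finite-difference construction needs one extra evaluation per variable; the abstract's contribution is a variance-gradient computation whose non-intrusive variant costs only one additional objective evaluation regardless of the number of variables.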
