Abstract

Topology optimization is one of the most flexible structural optimization methodologies. However, in exchange for its high level of design freedom, typical topology optimization cannot avoid multimodality, i.e., the existence of multiple local optima. This study develops a gradient-free topology optimization framework that avoids becoming trapped in undesirable local optima. Its core is the data-driven multifidelity topology design (MFTD) method, in which design candidates generated by solving low-fidelity topology optimization problems are updated through a deep generative model and high-fidelity evaluation. As its key component, the deep generative model compresses the original data into a low-dimensional manifold, i.e., the latent space, and randomly arranges new design candidates over that space. Although the original framework is gradient free, this randomness may lead to convergence variability and premature convergence. Inspired by the crossover operation popular in evolutionary algorithms (EAs), this study introduces into the data-driven MFTD framework a new crossover operation, called latent crossover, that is performed in the latent space. We apply the proposed method to a maximum stress minimization problem in 2D structural mechanics. The results demonstrate that the latent crossover improves convergence stability compared with the original data-driven MFTD method. Furthermore, the optimized designs exhibit performance comparable to or better than that of conventional gradient-based topology optimization using the P-norm stress measure.
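To make the idea of latent crossover concrete, the sketch below illustrates crossover applied to low-dimensional latent vectors of design candidates rather than to full-resolution material distributions. The encoder and decoder here are stand-in linear maps for a trained deep generative model such as a VAE, and the BLX-alpha-style blending is one plausible operator, not necessarily the one used in the paper; all names and parameters are illustrative assumptions.

```python
# Minimal sketch of a latent crossover step (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

N_ELEM = 64 * 64   # number of density variables in a 2D design
LATENT_DIM = 8     # dimensionality of the latent space

# Stand-in encoder/decoder weights; a real implementation would use a trained
# deep generative model (e.g., a VAE) fitted to the design candidates.
W_enc = rng.standard_normal((LATENT_DIM, N_ELEM)) / np.sqrt(N_ELEM)
W_dec = rng.standard_normal((N_ELEM, LATENT_DIM)) / np.sqrt(LATENT_DIM)

def encode(design: np.ndarray) -> np.ndarray:
    """Map a flattened density field to a latent vector."""
    return W_enc @ design

def decode(z: np.ndarray) -> np.ndarray:
    """Map a latent vector back to a density field clipped to [0, 1]."""
    return np.clip(W_dec @ z, 0.0, 1.0)

def latent_crossover(z1: np.ndarray, z2: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """BLX-alpha-like crossover: sample a child uniformly from an interval
    that extends the segment between the two parent latent vectors."""
    span = np.abs(z1 - z2)
    lo = np.minimum(z1, z2) - alpha * span
    hi = np.maximum(z1, z2) + alpha * span
    return rng.uniform(lo, hi)

# Two parent designs (random densities here, purely for illustration).
parent_a = rng.uniform(size=N_ELEM)
parent_b = rng.uniform(size=N_ELEM)

# Crossover in latent space, then decode the child back to a design candidate
# that would subsequently be passed to the high-fidelity evaluation.
child_z = latent_crossover(encode(parent_a), encode(parent_b))
child_design = decode(child_z)
print(child_design.shape)  # (4096,)
```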
