Machine learning (ML) methods have found applications in structural topology optimization. In existing methods, however, the ML models must be retrained whenever the design domain or supporting conditions change, which limits their broad applicability. In this paper, we propose a one-time training ML (OTML) method for general topology optimization, in which the self-attention convolutional long short-term memory (SaConvLSTM) model is introduced to update the design variables. An extension–division approach is used to enrich the training sets. By developing a splicing strategy, the training results of a small design space (i.e., a basic cell in either two or three dimensions) can be extended to tackle optimization problems over large design domains with arbitrary geometric shapes. Using the OTML method, the ML model needs to be trained only once, and the trained model can then be applied directly to optimization problems with arbitrary design-domain shapes, loads, and boundary conditions. In the SaConvLSTM model, the material volume of the post-processed thresholded designs can be precisely controlled, though the control precision of the gray-scale designs might be slightly sacrificed. The effects of model parameters on the computational cost and result quality are examined. Four examples are provided to demonstrate the high performance of this structural design method. For large-scale optimization problems, the present method can accelerate the structural form-finding process. This study holds promise for high-resolution structural form-finding and transdisciplinary computational morphogenesis.
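The abstract notes that the material volume of the post-processed thresholded designs can be precisely controlled. A common way to achieve this in topology optimization is a bisection search for the cutoff that projects a gray-scale density field onto a 0/1 design with the target volume fraction. The sketch below is illustrative only (the paper's exact post-processing is not given in the abstract); the function name and parameters are placeholders, and the random field stands in for an ML-predicted density map.

```python
import numpy as np

def volume_preserving_threshold(rho, vol_frac, max_iter=100):
    """Bisect for a cutoff t so that thresholding the gray-scale
    densities `rho` at t yields the target volume fraction `vol_frac`.

    Bisection works because the solid fraction of the thresholded
    design decreases monotonically as the cutoff t increases.
    Illustrative sketch; not the paper's actual post-processing step.
    """
    lo, hi = 0.0, 1.0
    t = 0.5
    for _ in range(max_iter):
        t = 0.5 * (lo + hi)
        solid = (rho >= t).mean()  # volume fraction of the 0/1 design
        if solid > vol_frac:       # too much material -> raise the cutoff
            lo = t
        else:                      # too little material -> lower the cutoff
            hi = t
    return (rho >= t).astype(float), t

# Stand-in for an ML-predicted gray-scale density field on a 64x64 grid.
rho = np.random.default_rng(0).random((64, 64))
design, t = volume_preserving_threshold(rho, vol_frac=0.4)
```

After convergence, the thresholded design's volume fraction matches the target to within one element's worth of material, which is the sense in which the binary design's volume can be controlled more tightly than the gray-scale one.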