Transformers have achieved remarkable results in various fields, including the modeling of dynamic systems governed by partial differential equations. However, transformers still face challenges in achieving long-term stable predictions for three-dimensional turbulence. In this paper, we propose an implicit factorized transformer (IFactFormer) model, which enables stable training at greater depths through implicit iteration over factorized attention. IFactFormer is applied to large eddy simulation of three-dimensional homogeneous isotropic turbulence (HIT), and is shown to be more accurate than the FactFormer, the Fourier neural operator (FNO), and the dynamic Smagorinsky model (DSM) in predicting the velocity spectra, the probability density functions of velocity increments and vorticity, the temporal evolutions of the root-mean-square values of velocity and vorticity, and the isosurfaces of the normalized vorticity. IFactFormer achieves long-term stable predictions of a series of turbulence statistics in HIT. Furthermore, IFactFormer showcases superior computational efficiency compared to the conventional DSM in large eddy simulation.