Abstract

This paper concerns the application of a long short-term memory (LSTM) model to the high-resolution reconstruction of turbulent pressure fluctuation signals from sparse (reduced) data. The model was trained on data from high-resolution computational fluid dynamics (CFD) simulations of high-speed turbulent boundary layers over a flat panel. During preprocessing, we used cubic spline interpolation to upsample the sparse signals before feeding them to the LSTM model for precise reconstruction. We evaluated the reconstruction with the root mean squared error (RMSE) metric and by inspection of power spectrum plots. Our study shows that the model achieved an accurate high-resolution reconstruction of the training signal and generalized to previously unseen signals of a similar nature with high accuracy. These results are promising for complex turbulent signals, whether produced experimentally or computationally.
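
As a rough illustration of the pipeline the abstract describes (spline upsampling of a sparse signal followed by LSTM reconstruction, evaluated with RMSE), the sketch below uses a synthetic stand-in signal and a small PyTorch model. The signal, window length, layer sizes, and training settings are all assumptions for demonstration, not the authors' actual data or configuration.

```python
# Minimal sketch of a spline-then-LSTM reconstruction pipeline.
# The synthetic signal and all hyperparameters are hypothetical.
import numpy as np
import torch
import torch.nn as nn
from scipy.interpolate import CubicSpline

# --- Synthetic stand-in for a turbulent pressure signal (assumption) ---
t_hi = np.linspace(0.0, 1.0, 1024)                 # high-resolution time grid
signal_hi = np.sin(40 * t_hi) + 0.3 * np.random.randn(t_hi.size)

# --- Sparse (reduced) sampling, then cubic-spline upsampling ---
t_lo = t_hi[::8]                                   # keep every 8th sample
spline = CubicSpline(t_lo, signal_hi[::8])
signal_spline = spline(t_hi)                       # spline-interpolated input

# --- Sliding windows: spline-upsampled input -> high-resolution target ---
win = 32
X = np.stack([signal_spline[i:i + win] for i in range(t_hi.size - win)])
y = np.stack([signal_hi[i + win] for i in range(t_hi.size - win)])
X = torch.tensor(X, dtype=torch.float32).unsqueeze(-1)   # (N, win, 1)
y = torch.tensor(y, dtype=torch.float32).unsqueeze(-1)   # (N, 1)

# --- Small LSTM regressor (hypothetical architecture) ---
class LSTMReconstructor(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])            # predict the next sample

model = LSTMReconstructor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):                            # short training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# RMSE on the training signal, mirroring the paper's evaluation metric
rmse = torch.sqrt(loss_fn(model(X), y)).item()
print(f"training RMSE: {rmse:.4f}")
```

The paper's second evaluation, comparison of power spectra, could be added to such a sketch by computing spectral estimates of the reconstructed and reference signals (e.g., with scipy.signal.welch) and plotting them together.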
