Abstract

The application of machine learning (ML) algorithms to turbulence modeling has shown promise in recent years, but it has so far been restricted to eddy-viscosity-based closure approaches. In this article, we discuss the rationale for applying machine learning with high-fidelity turbulence data to develop models at the level of Reynolds stress transport modeling. Based on this rationale, we compare different machine learning algorithms to determine their efficacy and robustness in modeling the different transport processes in the Reynolds stress transport equations. These data-driven algorithms are random forests, gradient-boosted trees, and neural networks. Direct numerical simulation (DNS) data for flow in channels are used for both training and testing of the ML models. The optimal hyperparameters of the ML algorithms are determined using Bayesian optimization. The efficacy of these algorithms is assessed in modeling and predicting the terms of the Reynolds stress transport equations. All three algorithms predict the turbulence parameters with an acceptable level of accuracy. The ML models are then applied to predict the pressure-strain correlation for flow cases different from those used in training, in order to assess their robustness and generalizability. This tests the assertion that ML-based data-driven turbulence models can overcome the modeling limitations of traditional turbulence models, and that ML models trained on large amounts of data spanning different classes of flows can predict the flow field with reasonable accuracy for unseen flows with similar physics. In addition to this verification, we validate the final ML models by assessing the importance of the different input features for prediction.
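The workflow summarized above (train a data-driven regressor on flow statistics, evaluate on held-out data, then inspect input-feature importance) can be sketched with scikit-learn. This is a minimal illustration under stated assumptions: the synthetic features and target merely stand in for DNS channel-flow statistics and a pressure-strain component, and a random forest is used as a representative of the three model families; it is not the paper's actual pipeline or data.

```python
# Illustrative sketch only: synthetic data stands in for DNS channel-flow
# statistics, and a random forest represents the compared ML model families.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical input features standing in for local flow statistics
# (e.g., anisotropy and mean-gradient invariants).
n_samples = 2000
X = rng.normal(size=(n_samples, 4))

# Hypothetical target standing in for one pressure-strain component:
# a smooth nonlinear function of the features plus noise. Feature I4 is
# deliberately irrelevant so its low importance can be detected.
y = X[:, 0] * X[:, 1] - 0.5 * X[:, 2] ** 2 + 0.1 * rng.normal(size=n_samples)

# Hold out data to mimic testing the trained model on unseen samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

score = r2_score(y_test, model.predict(X_test))
print(f"held-out R^2: {score:.3f}")

# Feature-importance assessment, mirroring the input-feature validation step.
for name, imp in zip(["I1", "I2", "I3", "I4"], model.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

In the study itself, the model hyperparameters are tuned with Bayesian optimization rather than left at defaults, and generalizability is probed by testing on flow cases absent from the training set rather than on a random split of one dataset.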
