Abstract

The article discusses machine learning algorithms for predicting turbulent viscosity, using flow past a backward-facing step as the test case. The training data are obtained from calculations performed with the OpenFOAM software package and a turbulence model. The significance of flow parameters for predicting turbulent viscosity, including velocity fluctuations, pressure and velocity gradients, the strain-rate tensor, and their combinations and invariants, is analyzed. Several machine learning algorithms are compared, and the Decision Tree Regressor is found to perform best for this case. Using the chosen model, the distribution of turbulent viscosity in the computational domain is predicted for various Reynolds numbers.
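The workflow described above (tabular flow features extracted from an OpenFOAM solution, several regressors compared, a Decision Tree Regressor selected) can be illustrated with a minimal sketch. This is not the authors' code: the file name `bfs_features.csv`, the feature column names, and the hyperparameters are assumptions for illustration only.

```python
# Minimal sketch (assumed data layout, not the authors' implementation):
# compare regressors for predicting turbulent viscosity "nut" from local
# flow features exported at cell centres of a backward-facing-step RANS case.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor

# Hypothetical CSV with per-cell features (gradients, strain-rate invariant,
# turbulent kinetic energy, velocity magnitude) and the target nut.
df = pd.read_csv("bfs_features.csv")                                # assumed file
feature_cols = ["gradU_mag", "gradP_mag", "S_inv2", "k", "U_mag"]   # assumed columns
X, y = df[feature_cols], df["nut"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Candidate regressors; the abstract reports the decision tree as the best choice.
models = {
    "Linear": LinearRegression(),
    "kNN": KNeighborsRegressor(n_neighbors=5),
    "DecisionTree": DecisionTreeRegressor(max_depth=12, random_state=0),
    "RandomForest": RandomForestRegressor(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name:>12}: R^2 = {r2_score(y_test, model.predict(X_test)):.3f}")
```

A held-out R^2 comparison like this is one common way to select among regressors; the feature-importance analysis mentioned in the abstract could similarly be sketched with the tree's `feature_importances_` attribute.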
