Machine learning was applied to large-eddy simulation (LES) data to develop nonlinear turbulence stress and heat-flux closures with improved prediction accuracy for trailing-edge cooling slot cases. The LES data were generated for a thick and a thin trailing-edge slot and were shown to agree well with experimental data, thus providing suitable training data for model development. A gene expression programming (GEP) based algorithm was used to symbolically regress novel nonlinear explicit algebraic stress models and heat-flux closures based on either the gradient diffusion or the generalized gradient diffusion approach. Steady Reynolds-averaged Navier–Stokes (RANS) calculations were then conducted with the new explicit algebraic stress models. The best overall agreement with the LES data was found when selecting the near-wall region, where high levels of anisotropy exist, as the training region, and using the mean squared error of the anisotropy tensor as the cost function. For the thin-lip geometry, the adiabatic wall effectiveness was predicted in good agreement with the LES and experimental data when the GEP-trained model was combined with the standard eddy-diffusivity model. Crucially, the same model combination also significantly improved the predictive accuracy of adiabatic wall effectiveness for different blowing ratios (BRs), despite those not having been seen during training. For the thick-lip case, the agreement with reference values deteriorated due to the presence of vortex shedding at scales that are large relative to the slot height. A GEP-trained scalar-flux model, used in conjunction with a trained RANS model, was found to significantly improve the prediction of the adiabatic wall effectiveness.