Abstract

A fundamental investigation of turbulence-radiation interaction (TRI) is presented in this paper, focusing on the correlations that arise in the time-averaging of the radiative absorption and radiative emission. The analyses are based on transient data generated from fully-coupled, high-resolution large eddy simulations of a set of turbulent, large-scale pool fires. Coefficients that quantify the magnitude of the individual correlations are identified and, using the transient data, they are computed throughout the computational domain and compared to one another. Results show that turbulent fluctuations increase both the mean emission and the mean absorption. For the latter, the increase is particularly pronounced in the hot gas plume region. In this region, the cross-correlation between the local absorption coefficient and the incident radiation accounts for no more than 15% of the total absorption, and disregarding it leads to negligible errors in the prediction of the mean radiation loss, which points to the applicability of the optically thin fluctuation approximation. Among the correlations in the time-averaged emission term, the temperature autocorrelation is the most important one, followed by the absorption coefficient-temperature cross-correlation. However, radiative transfer calculations considering different closure methods for the emission term demonstrate that accounting for the temperature autocorrelation alone is not sufficient for an accurate solution of the mean radiation field. Instead, all emission-TRI correlations must be considered for a proper representation of the mean radiative emission.
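The correlations discussed above can be made concrete with the standard TRI decompositions of the time-averaged emission and absorption terms. The following is a sketch in generic notation (overbars for time averages, primes for fluctuations, \(\kappa\) for the absorption coefficient, \(G\) for the incident radiation); the paper's own symbols may differ.

```latex
% Emission TRI: splitting the time-averaged kappa*T^4 exposes the
% absorption coefficient-temperature cross-correlation and, within
% the mean of T^4, the temperature autocorrelation terms:
\overline{\kappa T^4}
  = \bar{\kappa}\,\overline{T^4} + \overline{\kappa' \left(T^4\right)'},
\qquad
\overline{T^4}
  = \bar{T}^4 + 6\,\bar{T}^2\,\overline{T'^2}
    + 4\,\bar{T}\,\overline{T'^3} + \overline{T'^4}

% Absorption TRI: the optically thin fluctuation approximation (OTFA)
% neglects the absorption coefficient-incident radiation correlation:
\overline{\kappa G}
  = \bar{\kappa}\,\bar{G} + \overline{\kappa' G'}
  \;\approx\; \bar{\kappa}\,\bar{G} \quad \text{(OTFA)}
```

In this notation, the abstract's findings read as: \(\overline{\kappa' G'}\) is small (at most 15% of the total absorption in the plume), while on the emission side the \(\overline{T'^n}\) autocorrelation terms dominate but \(\overline{\kappa'(T^4)'}\) cannot be neglected.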
