The objective of this study is to assess the effects of Turbulence–Radiation Interactions (TRIs) on the structure of small-scale pool fires and on the radiative fluxes transferred to surrounding surfaces. The fire-induced flow is modeled using a buoyancy-modified k-ε model and the Steady Laminar Flamelet (SLF) model coupled with a presumed Probability Density Function (pdf) approach. A 34-kW methane pool fire produced by a burner with a diameter of 0.38 m is simulated by neglecting radiation, by considering radiation without TRIs, and by considering radiation with TRIs. Computations carried out with radiation are based on the Full Spectrum Correlated-k (FSCK) method. TRIs are taken into account by means of the Optically Thin Fluctuation Approximation (OTFA). The mean radiative source term and the mean Radiative Transfer Equation (RTE) are then closed using a presumed pdf of the mixture fraction, scalar dissipation rate, and enthalpy defect. When TRIs are considered, the predicted flame structure, radiant fraction, and radiative fluxes are found to be in quantitative agreement with the available experimental data. Simulations reveal that TRIs significantly enhance radiative losses and contribute substantially to the radiation-induced drop in temperature. TRIs also contribute to reducing turbulence levels and the root mean square (rms) values of temperature fluctuations. In addition, radiative heat fluxes on remote targets are found to be considerably higher than those obtained from radiative calculations based on mean properties. Finally, different levels of closure for the TRI-related terms are assessed. Model results show that the complete absorption coefficient–Planck function correlation must be retained to properly account for the influence of TRIs on the emission term, whereas the effect of the absorption coefficient self-correlation on the absorption term is to reduce the radiant fraction by about 12%.
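For reference, a minimal sketch of the OTFA-based TRI closure outlined above, written here in gray form with illustrative notation (Z for the mixture fraction, χ for the scalar dissipation rate, ξ_h for the enthalpy defect, G for the incident radiation); the formulation used in the paper is spectral and relies on the FSCK method:

\[
\overline{\nabla \cdot \mathbf{q}_r} \;\approx\; 4\pi\,\overline{\kappa I_b} \;-\; \overline{\kappa}\,\overline{G},
\qquad
\overline{\kappa I_b} \;=\; \int \kappa(Z,\chi,\xi_h)\, I_b(Z,\chi,\xi_h)\, \tilde{P}(Z,\chi,\xi_h)\, \mathrm{d}Z\, \mathrm{d}\chi\, \mathrm{d}\xi_h .
\]

Under the OTFA, fluctuations of the absorption coefficient are assumed uncorrelated with the incident intensity, so only the mean absorption coefficient appears in the absorption term, while the emission term retains the full absorption coefficient–Planck function correlation evaluated with the presumed pdf.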