Thermal radiation is an important heat transfer mechanism in fire, but its modeling has often been over-simplified due to computational expense. In this study, we assess whether the Monte Carlo ray tracing (MCRT) method, combined with a line-by-line (LBL) spectral database, can serve as a fast and accurate approach for solving radiation in fire. Using a snapshot extracted from a previous well-resolved simulation of a small turbulent pool fire, the radiation field is solved with a recently developed MCRT-LBL solver while varying three parameters: the number of rays per computational cell, the energy emission scheme, and the resolution of the LBL database. The energy equation is then solved for one time step to assess how statistical noise propagates into the temperature field. The radiative absorption is well predicted even with only one ray emitted per cell. Adaptive emission, where more rays are emitted at locations with stronger volumetric radiative emission, converges faster than the standard approach, where the same number of rays is emitted from every cell, although the advantage of the former is pronounced only when a sufficiently large number of rays is emitted. The statistical noise is smoothed in the calculation of the radiative heat source because the radiative emission is predicted accurately regardless of the solver’s parameters. Further damping is provided by the energy equation, yielding a temperature field with negligible noise. Coarsening the LBL database only marginally affects the accuracy of the results, yet it reduces the memory required to store the data to less than 10% of that of the original database. Lastly, when soot dominates radiative emission, the statistical noise of the radiation source is further reduced. In conclusion, the study indicates that MCRT-LBL is a feasible approach for fire modeling, and we will further demonstrate this in future studies using larger fires.
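The contrast between the two emission schemes can be illustrated with a minimal sketch, assuming a simple proportional-allocation rule; the function and field names below are hypothetical and do not reproduce the authors' solver. The key property is that each ray carries its cell's emitted power divided by the number of rays in that cell, so the total emitted power is reproduced exactly under either scheme, consistent with the observation that emission is predicted accurately regardless of the solver's parameters.

```python
import numpy as np

def allocate_rays(emission, total_rays, scheme="adaptive"):
    """Distribute rays among cells (hypothetical sketch, not the authors' code).

    scheme="uniform":  the same number of rays is emitted from every cell.
    scheme="adaptive": rays are allocated in proportion to each cell's
    volumetric radiative emission, with a floor of one ray per cell so
    that no cell is left unsampled.
    """
    n = emission.size
    if scheme == "uniform":
        return np.full(n, total_rays // n, dtype=int)
    weights = emission / emission.sum()
    return np.maximum(1, np.floor(weights * total_rays).astype(int))

# Hypothetical 1-D emission field: a few hot cells dominate, as in a flame.
emission = np.array([1.0, 1.0, 50.0, 200.0, 1.0])
uniform = allocate_rays(emission, 500, "uniform")
adaptive = allocate_rays(emission, 500, "adaptive")

# Per-ray energy weight: total emitted power is conserved in both schemes,
# since (emission / rays) * rays sums back to the cell emissions exactly.
energy_per_ray = emission / adaptive
```

Under this rule the hottest cells receive far more rays than the cold surroundings, which is why adaptive emission reduces variance where the source term is largest, while the uniform scheme spends the same sampling effort everywhere.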