Abstract

A fast on-the-fly interpolation database for radiative heat transfer properties was developed for high-temperature radiative gas-dynamic simulations under local thermodynamic equilibrium (LTE). The absorption coefficient includes atomic (bound-bound, bound-free, free-free) transitions and molecular band systems. A 2D finite-volume radiative transfer equation (RTE) solver was developed and coupled with a compressible reacting flow solver to simulate a laser-induced decaying spark (LIDS) in air. A part-spectrum correlated k-distribution with Gauss-Chebyshev quadrature and a mixing model were applied to reduce the number of RTE evaluations. The non-collimated radiative heat loss in the LIDS was calculated, and the effects of radiation loss on the shock wave and kernel cooling were investigated. The total radiation loss amounted to approximately 2.3% of the absorbed energy and occurred on a microsecond timescale; most of the kernel energy is carried away by the shock wave. Simulations with and without the RTE were compared to analyse the effects of radiative transfer. Radiation loss attenuates the shock-wave velocity and delays the emanation of the blast wave. Cooling of the plasma kernel is accelerated by the energy lost through radiative transfer: radiation increases the cooling rate in the primary cooling regime and decreases it during the secondary regime owing to the energy deficit in the kernel. The transition from the primary to the secondary cooling regime is not uniform throughout the kernel.
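
To illustrate why a correlated k-distribution with Gauss-Chebyshev quadrature reduces the number of RTE evaluations, the sketch below reorders a line-by-line absorption spectrum into a smooth cumulative distribution k(g) and integrates it with a handful of quadrature nodes instead of one evaluation per spectral point. This is a minimal, generic illustration, not the authors' database or solver: the function names, the quadrature order, and the synthetic lognormal spectrum are assumptions introduced here for demonstration only.

```python
import numpy as np

def k_distribution(kappa, weights=None):
    """Reorder a spectral absorption coefficient kappa(eta) into a smooth
    cumulative k-distribution k(g) by sorting (illustrative only)."""
    kappa = np.asarray(kappa, dtype=float)
    if weights is None:
        # Uniform spectral weights over the band; a real model would use
        # Planck-function weighting at the local temperature.
        weights = np.full_like(kappa, 1.0 / kappa.size)
    order = np.argsort(kappa)
    g = np.cumsum(weights[order])      # cumulative distribution, g in (0, 1]
    return g, kappa[order]

def gauss_chebyshev_nodes(n):
    """First-kind Gauss-Chebyshev nodes mapped from [-1, 1] to g in (0, 1),
    with weights arranged so that sum(w * f(g)) approximates int_0^1 f(g) dg."""
    j = np.arange(1, n + 1)
    x = np.cos((2 * j - 1) * np.pi / (2 * n))       # Chebyshev nodes on (-1, 1)
    w = (np.pi / n) * np.sqrt(1.0 - x**2) / 2.0     # weights for plain dg integration
    return 0.5 * (x + 1.0), w

# Hypothetical band: 10,000 spectral points collapse to 8 quadrature nodes,
# i.e. 8 RTE solves per band instead of 10,000.
kappa_spectrum = np.random.lognormal(mean=0.0, sigma=2.0, size=10_000)
g_cum, k_cum = k_distribution(kappa_spectrum)
g_nodes, w_nodes = gauss_chebyshev_nodes(8)
k_nodes = np.interp(g_nodes, g_cum, k_cum)          # k(g) at the quadrature points
band_mean_kappa = np.sum(w_nodes * k_nodes)         # close to kappa_spectrum.mean()
```

In an actual coupled simulation, each quadrature node k(g_j) would feed one RTE solve, and the node results would be combined with the weights w_j; the quadrature order and the part-spectrum/mixing treatment used in the paper are not reproduced here.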
