Abstract

Local thermal instability (TI) can plausibly explain the formation of multiphase gas in many different astrophysical environments, but the theory of local TI is only well understood in the optically thin limit of the equations of radiation hydrodynamics (RHD). Here, we lay the groundwork for transitioning from this limit to a full RHD treatment assuming a gray opacity formalism. We consider a situation where the gas becomes thermally unstable due to the hardening of the radiation field when the main radiative processes are free–free cooling and Compton heating. We identify two ways in which this can happen: (i) when the Compton temperature increases with time, through a rise in either the intensity or energy of a hard X-ray component; and (ii) when attenuation reduces the flux of the thermal component such that the Compton temperature increases with depth through the slab. Both ways likely occur in the broad-line region of active galactic nuclei, where columns of gas can be ionization-bounded. In such instances where attenuation is significant, thermal equilibrium solution curves become position-dependent, and it no longer suffices to assess the stability of an irradiated column of gas at all depths using a single equilibrium curve. We demonstrate how to analyze a new equilibrium curve—the attenuation curve—for this purpose, and we show that, by Field’s instability criterion, a negative slope along this curve indicates that constant-density slabs are thermally unstable whenever the gas temperature increases with depth.
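For orientation, the following is a minimal sketch of the standard relations the abstract invokes; it is not taken from the paper itself. Field's criterion for the instability of a constant-density (isochoric) perturbation can be written in terms of a net cooling function $\mathcal{L}(\rho, T)$ (energy losses minus gains per unit mass, vanishing in equilibrium), and the free–free plus Compton heating balance below uses the usual textbook coefficients ($\Lambda_{\rm ff}$, the frequency-integrated flux $F$, and the Compton temperature $T_C$), which are assumptions here rather than the paper's specific notation:
\[
\left(\frac{\partial \mathcal{L}}{\partial T}\right)_{\rho} < 0
\quad \Longrightarrow \quad \text{isochoric thermal instability,}
\]
with the optically thin net loss rate per unit volume of the form
\[
\rho\,\mathcal{L} \simeq n_e n_i\,\Lambda_{\rm ff}\,T^{1/2}
\;-\; \frac{4 k_B \sigma_T n_e F}{m_e c^2}\,\bigl(T_C - T\bigr).
\]
In this picture, hardening the radiation field (raising $T_C$, either in time or with depth as the soft component is attenuated) shifts the equilibrium balance and can drive the slope of the relevant equilibrium curve negative, which is the sense in which the attenuation curve is used as a stability diagnostic.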
