Abstract

Galaxy cluster cores are pervaded by hot gas that radiates at far too high a rate to maintain any semblance of a steady state; this is referred to as the cooling flow problem. Of the many heating mechanisms that have been proposed to balance radiative cooling, one of the most attractive is the dissipation of acoustic waves generated by active galactic nuclei. Fabian et al. showed that if the waves are nearly adiabatic, wave damping due to heat conduction and viscosity must be well below the standard Coulomb rates for the waves to propagate throughout the core. Because of the importance of this result, we have revisited wave dissipation under galaxy cluster conditions in a way that accounts for the self-limiting nature of dissipation by electron thermal conduction, allows the electron and ion temperature perturbations in the waves to evolve separately, and estimates kinetic effects by comparison with a semicollisionless theory. While these effects considerably enlarge the toolkit for analyzing observations of wavelike structures and for developing a quantitative theory of wave heating, the drastic reduction of transport coefficients proposed by Fabian et al. remains the most viable path to acoustic wave heating of galaxy cluster cores.
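To illustrate the constraint the abstract refers to, the sketch below estimates the damping length of an acoustic wave in a cool-core cluster when conduction and viscosity take their unsuppressed Coulomb (Spitzer-level) values. It uses the classical collisional sound-absorption coefficient from Landau & Lifshitz rather than the paper's two-temperature or semicollisionless treatment, and all parameter values (temperature, density, wavelength, Coulomb logarithm, core radius) are illustrative assumptions, not numbers taken from the paper.

```python
# Order-of-magnitude damping length of an acoustic wave in a cluster core,
# assuming the classical collisional absorption coefficient
#   alpha = omega^2 / (2 rho c_s^3) * [ (4/3) eta + kappa * (1/c_v - 1/c_p) ]
# (Landau & Lifshitz, Fluid Mechanics) with Spitzer-level transport.
# Parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

# physical constants (cgs)
k_B = 1.380649e-16        # erg / K
m_p = 1.6726e-24          # g
kpc = 3.086e21            # cm

# assumed cool-core conditions
T     = 3.0e7             # K (~2.6 keV)
n_e   = 0.05              # cm^-3 electron density
mu    = 0.6               # mean molecular weight
lnLam = 40.0              # Coulomb logarithm
lam   = 10.0 * kpc        # assumed acoustic wavelength

# mass density; n_e/n ~ 0.52 for a fully ionized H/He plasma
rho = mu * m_p * n_e / 0.52

gamma = 5.0 / 3.0
c_s   = np.sqrt(gamma * k_B * T / (mu * m_p))   # adiabatic sound speed
omega = 2.0 * np.pi * c_s / lam                 # wave angular frequency

# unsuppressed Coulomb (Spitzer/Braginskii) transport coefficients
kappa = 1.84e-5 * T**2.5 / lnLam                # erg s^-1 cm^-1 K^-1
eta   = 2.2e-15 * T**2.5 / lnLam                # g cm^-1 s^-1 (ion viscosity)

# specific heats per unit mass for a monatomic ideal gas
c_v = 1.5 * k_B / (mu * m_p)
c_p = 2.5 * k_B / (mu * m_p)

# amplitude absorption coefficient and e-folding damping length
alpha  = omega**2 / (2.0 * rho * c_s**3) * (4.0 / 3.0 * eta + kappa * (1.0 / c_v - 1.0 / c_p))
L_damp = 1.0 / alpha

print(f"sound speed    : {c_s / 1e5:.0f} km/s")
print(f"damping length : {L_damp / kpc:.0f} kpc  (assumed core radius ~50-100 kpc)")
```

With these assumed numbers the damping length comes out to a few tens of kpc, comparable to or shorter than a typical cool-core radius, which is one way to see why Fabian et al. argue that transport must be suppressed well below Coulomb rates for the waves to heat the entire core.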