Abstract

There is considerable evidence for widespread subsonic turbulence in galaxy clusters, most notably from Hitomi. Turbulence is often invoked to offset radiative losses in cluster cores, both by direct dissipation and by enabling turbulent heat diffusion. However, in a stratified medium, buoyancy forces oppose radial motions, making turbulence anisotropic. This can be quantified via the Froude number Fr, which decreases inward in clusters as stratification increases. We exploit analogies with MHD turbulence to show that wave–turbulence interactions increase cascade times and reduce dissipation rates, so that ϵ ∝ Fr. Equivalently, for a given energy injection/dissipation rate ϵ, turbulent velocities u must be higher than Kolmogorov scalings would suggest. High-resolution hydrodynamic simulations show excellent agreement with the ϵ ∝ Fr scaling, which sets in for Fr ≲ 0.1. We also test the previously predicted scaling D ∝ Fr² for the turbulent diffusion coefficient and find excellent agreement for Fr ≲ 1. However, we find a different normalization, corresponding to diffusive suppression stronger by more than an order of magnitude. Our results imply that turbulent diffusion is more heavily suppressed by stratification, over a much wider radial range, than turbulent dissipation; thus, direct dissipation potentially dominates the heating. Furthermore, this shift implies that significantly higher turbulent velocities are required to offset cooling than previous models suggest. These results are potentially relevant to turbulent metal diffusion in galaxy groups and clusters (which is likewise suppressed), and to planetary atmospheres.
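
As a rough illustration of the quoted scalings (a minimal sketch, not code from the paper), the Python snippet below compares the Kolmogorov dissipation rate ϵ ~ u³/L with the stratified regime ϵ ~ Fr u³/L (valid for Fr ≲ 0.1) and evaluates the suppressed diffusion coefficient D ~ Fr² u L. The definition Fr = u/(N L), with N the buoyancy (Brunt–Väisälä) frequency and L the driving scale, is standard; the unit prefactors and the cluster-core-like values of u, L, and N are illustrative assumptions, not numbers from the paper.

    # Illustrative comparison of the abstract's scalings. All order-unity
    # prefactors are set to 1 and all parameter values are placeholders.
    kpc = 3.086e21                # cm per kpc

    u = 1.0e7                     # turbulent velocity [cm/s] ~ 100 km/s (assumed)
    L = 10.0 * kpc                # driving/outer scale [cm] ~ 10 kpc (assumed)
    N = 4.0e-15                   # buoyancy (Brunt-Vaisala) frequency [1/s] (assumed)

    Fr = u / (N * L)              # Froude number, Fr = u / (N L)

    eps_kolmogorov = u**3 / L               # unstratified cascade: eps ~ u^3 / L
    eps_stratified = Fr * eps_kolmogorov    # abstract's eps ∝ Fr regime (Fr <~ 0.1)
    D_stratified = Fr**2 * u * L            # abstract's D ∝ Fr^2 regime (Fr <~ 1)

    print(f"Fr                 = {Fr:.3f}")
    print(f"eps (Kolmogorov)   = {eps_kolmogorov:.3e} erg g^-1 s^-1")
    print(f"eps (stratified)   = {eps_stratified:.3e} erg g^-1 s^-1")
    print(f"D   (stratified)   = {D_stratified:.3e} cm^2 s^-1")
    print(f"dissipation suppressed by ~ 1/Fr   = {1/Fr:.1f}x")
    print(f"diffusion   suppressed by ~ 1/Fr^2 = {1/Fr**2:.1f}x")

Since ϵ is suppressed by one power of Fr but D by two, diffusion is hit much harder as Fr decreases. Conversely, at fixed ϵ, inverting ϵ ~ u⁴/(N L²) gives u ~ (ϵ N L²)^(1/4) instead of the Kolmogorov u ~ (ϵ L)^(1/3), which is larger whenever Fr < 1 (again assuming unit prefactors); this is the sense in which higher turbulent velocities are required to offset cooling.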
