Our aim is to understand how the interplay between black hole (BH) feedback and merger processes can effectively turn cool-core galaxy clusters into hot-core clusters in the modern Universe. We also aim to clarify which parameters of the BH feedback model used in simulations can cause an excess of feedback at the scale of galaxy groups while failing to efficiently suppress star formation at the scale of galaxy clusters. To obtain robust statistics of the cool-core population, we compare the modern-Universe snapshot (z = 0.25) of the largest Magneticum simulation (Box2b/hr) with the eROSITA eFEDS survey and with Planck SZ-selected clusters observed with XMM-Newton. Additionally, we compare the BH feedback injected by the simulation in radio mode with Chandra observations of X-ray cavities and LOFAR observations of radio emission. We confirm a decreasing trend in cool-core fractions towards the most massive galaxy clusters, which is well reproduced by the Magneticum simulations. This evolution is connected with increased merger activity that injects high-energy particles into the core region, but it also requires thermalisation and conductivity to enhance mixing throughout the intra-cluster medium core, with both factors becoming increasingly efficient towards the high-mass end. BH feedback, on the other hand, remains the dominant factor at the scale of galaxy groups, while its relative impact decreases towards the most massive clusters. The problems in suppressing star formation in simulations are not caused by low BH feedback efficiencies; rather, they are rooted in the definition of the black hole sphere of influence used to distribute the feedback, which shrinks as density and accretion rate increase. In fact, a decreasing BH feedback efficiency towards low-mass galaxy groups is required to prevent overheating. These problems can be addressed in simulations by using relations between accretion rate, cavity power, and cavity reach derived from X-ray observations.