Abstract

Studies of the X-ray surface brightness profiles of clusters, coupled with theoretical considerations, suggest that the breaking of self-similarity in the hot gas results from an `entropy floor', established by some heating process, which affects the structure of the intracluster gas most strongly in lower mass systems. Fitting analytical models for the radial variation in gas density and temperature to X-ray spectral images from the ROSAT PSPC and ASCA GIS, we derive gas entropy profiles for 20 galaxy clusters and groups. When these profiles are scaled so that they would coincide in the self-similar case, the lowest mass systems are found to have higher scaled entropy profiles than more massive systems. This appears to be due to a baseline entropy of 70-140 h50^-1/3 keV cm^2, depending on the extent to which shocks have been suppressed in low mass systems. The extra entropy may be present in all systems, but only in poor clusters is it significant compared to the entropy generated by gravitational collapse, and hence detectable. This excess entropy appears to be distributed uniformly with radius outside the central cooling regions. We determine the energy associated with this entropy floor by studying the net reduction in binding energy of the gas in low mass systems, and find that it corresponds to a preheating temperature of ~0.3 keV. Since the relationship between entropy and energy injection depends upon gas density, we can combine the excesses of 70-140 keV cm^2 and ~0.3 keV to derive the typical electron density of the gas into which the energy was injected. The resulting value of 1-3 x 10^-4 h50^1/2 cm^-3 implies that the heating must have happened prior to cluster collapse but after a redshift z~7-10. The energy requirement is well matched to the energy released by the supernova explosions responsible for the metals which now pollute the intracluster gas.
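
The density estimate quoted above can be reproduced if one assumes the standard X-ray definition of gas entropy, S = kT / n_e^(2/3) in keV cm^2, so that an energy injection of kT per particle into gas of electron density n_e gives n_e = (kT/S)^(3/2). The short Python sketch below illustrates that arithmetic only; the function name is ours, and the h50 scalings are omitted for simplicity.

    def preheated_density(kT_keV, entropy_keV_cm2):
        """Electron density (cm^-3) implied by injecting kT per particle
        into gas that ends up with entropy S = kT / n_e**(2/3)."""
        return (kT_keV / entropy_keV_cm2) ** 1.5

    # Entropy floor of 70-140 keV cm^2 and preheating temperature ~0.3 keV
    for S in (70.0, 140.0):
        n_e = preheated_density(0.3, S)
        print(f"S = {S:.0f} keV cm^2  ->  n_e ~ {n_e:.1e} cm^-3")
    # Gives roughly 1-3 x 10^-4 cm^-3, consistent with the range quoted
    # in the abstract (h50^1/2 dependence not included here).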
