Abstract

We show that the observed mass-to-light (M/L) ratio of galaxy clusters increases with cluster temperature, as expected from cosmological simulations. Contrary to previous observational suggestions, we find a mild but robust increase of M/L from poor (T ~ 1-2 keV) to rich (T ~ 12 keV) clusters; over this range, the mean M/L_V increases by a factor of about 2. The best-fit relation is M/L_V = (170 ± 30) T^0.3 h at z = 0, with a large scatter. This trend confirms the prediction of cosmological simulations that the richest clusters are antibiased, with a higher ratio of mass per unit light than average; the antibias increases with cluster temperature. The effect is caused by the older ages of the highest-density clusters, whose light has dimmed more than average since their earlier formation time. Combining the current observations with simulations, we find a global value of M/L_V = 240 ± 50 h and a corresponding mass density of the universe of Ω_m = 0.17 ± 0.05.
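
The two headline numbers can be cross-checked against each other. Below is a minimal sketch in Python, not from the paper, assuming a standard V-band luminosity density j_V ≈ 2 × 10^8 h L_sun Mpc^-3 and a critical density ρ_crit ≈ 2.775 × 10^11 h^2 M_sun Mpc^-3 (neither value is quoted in this abstract); the slope of 0.3 in M/L_V(T) is inferred from the quoted factor-of-2 rise between T ~ 1-2 keV and T ~ 12 keV. It reproduces both that rise and the inferred Ω_m:

RHO_CRIT = 2.775e11   # critical density, h^2 M_sun / Mpc^3 (assumed standard value)
J_V = 2.0e8           # V-band luminosity density, h L_sun / Mpc^3 (assumed standard value)

def m_over_l(T_keV, norm=170.0, slope=0.3):
    """Best-fit cluster mass-to-light ratio M/L_V(T) = norm * T^slope, in h M_sun/L_sun."""
    return norm * T_keV**slope

# Poor (T ~ 1.5 keV) to rich (T ~ 12 keV) clusters: the ratio should be ~2.
print(m_over_l(12.0) / m_over_l(1.5))   # -> ~1.87, the quoted factor-of-2 increase

# Omega_m = (global M/L_V) * j_V / rho_crit; with M/L_V = 240 h this gives ~0.17.
omega_m = 240.0 * J_V / RHO_CRIT
print(omega_m)                          # -> ~0.173, matching Omega_m = 0.17 +/- 0.05

Note that the h dependence cancels: M/L_V and j_V each carry one power of h while ρ_crit carries h^2, so the inferred Ω_m is independent of the Hubble constant.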
