Abstract

We measure the evolution of the X-ray luminosity–temperature ($L_X$–$T$) relation since z ∼ 1.5 using a sample of 211 serendipitously detected galaxy clusters with spectroscopic redshifts drawn from the XMM Cluster Survey first data release (XCS-DR1). This is the first study spanning this redshift range using a single, large, homogeneous cluster sample. Using an orthogonal regression technique, we find no evidence for evolution in the slope or intrinsic scatter of the relation since z ∼ 1.5, finding both to be consistent with previous measurements at z ∼ 0.1. However, the normalization is seen to evolve negatively with respect to the self-similar expectation: we find $E^{-1}(z)\,L_X = 10^{44.67 \pm 0.09}\,(T/5)^{3.04 \pm 0.16}\,(1+z)^{-1.5 \pm 0.5}$, which is within 2σ of the zero-evolution case. We see milder, but still negative, evolution with respect to self-similar when using a bisector regression technique. We compare our results to numerical simulations, fitting the simulated cluster samples with the same methods applied to the XCS data. Our data favour models in which the majority of the excess entropy required to explain the slope of the $L_X$–$T$ relation is injected at high redshift. Simulations in which active galactic nucleus feedback is implemented using prescriptions from current semi-analytic galaxy formation models predict positive evolution of the normalization, and differ from our data at more than 5σ. This suggests that more efficient feedback at high redshift may be needed in these models.
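As a concrete illustration of the quoted best-fit relation, the short Python sketch below evaluates the predicted luminosity for a cluster of given temperature and redshift. It is not the authors' pipeline: the cosmological parameters (Ωm = 0.3, ΩΛ = 0.7), the assumption that T is in keV and $L_X$ in erg s⁻¹, and the function names are illustrative assumptions; only the central fit values (44.67, 3.04, −1.5) come from the abstract.

```python
import numpy as np

# Assumed flat LCDM parameters (not quoted in the abstract).
OMEGA_M, OMEGA_L = 0.3, 0.7

def E(z):
    """Dimensionless Hubble parameter E(z) = H(z)/H0 for a flat LCDM cosmology."""
    return np.sqrt(OMEGA_M * (1.0 + z) ** 3 + OMEGA_L)

def predicted_Lx(T_keV, z, log10_norm=44.67, slope=3.04, evo=-1.5):
    """Best-fit L_X (assumed erg/s) at temperature T (assumed keV) and redshift z,
    obtained by inverting E^-1(z) L_X = 10^log10_norm * (T/5)^slope * (1+z)^evo."""
    return E(z) * 10.0 ** log10_norm * (T_keV / 5.0) ** slope * (1.0 + z) ** evo

# Example: a 5 keV cluster at low and high redshift.
for z in (0.1, 1.0):
    print(f"z = {z:.1f}: L_X ~ {predicted_Lx(5.0, z):.2e} erg/s")
```

The negative evolution term (1 + z)⁻¹·⁵ partly offsets the self-similar E(z) factor, so in this sketch a cluster of fixed temperature is only mildly more luminous at z = 1 than at z = 0.1.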
