Abstract

The lag–luminosity relation for gamma-ray bursts (GRBs) is an anti-correlation between the time lag, τlag, which represents the delay between the arrival of high-energy and low-energy photons, and the isotropic peak luminosity, L. In this paper, we investigate the possible redshift evolution of this relation using a sample of 31 Swift bursts. Our analysis consists of binning the data in redshift, z, then applying a fit of the form log(L) = A + B log(τlag0/〈τlag0〉) to each bin, where τlag0 is the time lag in the burst's rest frame and 〈τlag0〉 is the corresponding mean value for the entire sample. The objective is to see whether the two fitting parameters, A and B, evolve in a systematic way with z. Our results indicate that the normalization, A, does seem to vary systematically with redshift. The slope, B, also shows some dependence on z, but this dependence is less pronounced than that of A. It should be noted, however, that although good best fits were obtained, with reasonable values for both the linear regression coefficient, r, and the reduced chi-squared, the data showed large scatter. Furthermore, the number of GRBs in the sample studied is rather small, so our conclusions are only tentative at this point. A flat universe with ΩM = 0.27, ΩΛ = 0.73, and a Hubble constant H0 = 70 km s−1 Mpc−1 is assumed.
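The binned fitting procedure described above can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the burst sample here is synthetic (random values standing in for the 31 Swift bursts), the bin edges are chosen as simple quantiles of z, and the fit is an ordinary least-squares regression in log space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the 31-burst sample (illustrative values only):
# redshift z, rest-frame lag tau0 (s), and log of isotropic peak luminosity.
z = rng.uniform(0.5, 4.0, 31)
tau0 = 10 ** rng.normal(-0.5, 0.5, 31)  # lags of order 0.1-1 s
logL = 52.0 - 1.0 * np.log10(tau0 / tau0.mean()) + rng.normal(0, 0.3, 31)

mean_tau0 = tau0.mean()  # <tau0> computed over the whole sample, not per bin

def fit_bin(tau0_bin, logL_bin, mean_tau0):
    """Least-squares fit of log L = A + B log(tau0 / <tau0>) for one z-bin."""
    x = np.log10(tau0_bin / mean_tau0)
    B, A = np.polyfit(x, logL_bin, 1)    # slope B, normalization A
    r = np.corrcoef(x, logL_bin)[0, 1]   # linear correlation coefficient
    return A, B, r

# Three redshift bins with roughly equal numbers of bursts.
edges = np.quantile(z, [0, 1/3, 2/3, 1])
for lo, hi in zip(edges[:-1], edges[1:]):
    sel = (z >= lo) & (z <= hi)
    A, B, r = fit_bin(tau0[sel], logL[sel], mean_tau0)
    print(f"z in [{lo:.2f}, {hi:.2f}]: A = {A:.2f}, B = {B:.2f}, r = {r:.2f}")
```

Tracking how A and B change from bin to bin is then what reveals (or rules out) systematic evolution with redshift; a real analysis would also propagate measurement errors on τlag0 and L into the fit.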
