Abstract

The detection of delayed emission in the X-ray, optical, and radio bands, i.e., the afterglow of gamma-ray bursts (GRBs), suggests that the sources of GRBs lie at cosmological distances. Here we explore the interaction of a relativistic shell with a uniform interstellar medium (ISM) and obtain an exact solution for the evolution of gamma-ray burst remnants, including radiative losses. We show that in general the bulk Lorentz factor evolves as γ ∝ t^(−α_t) when γ ≫ 1, where α_t lies mainly in the range 9/22–3/8, the latter value corresponding to adiabatic expansion. It is therefore clear that adiabatic expansion is a good approximation even when radiative losses are considered. In fact, however, α_t is slightly larger than 3/8, which may have some effect on detailed data analysis. Synchrotron self-absorption is also calculated, and it is demonstrated that the radio emission may become optically thin during the afterglow. Our solution also applies to the nonrelativistic case (γ ∼ 1), in which the observed flux decreases more rapidly than in the relativistic case.
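As a minimal illustration of the quoted scaling (not the paper's exact solution; the function name and fiducial values of γ₀ and t₀ below are assumptions for the sketch), the power law γ ∝ t^(−α_t) with the two bracketing indices can be written as:

```python
def lorentz_factor(t, gamma0, t0, alpha_t):
    """Power-law decay of the bulk Lorentz factor: gamma(t) = gamma0 * (t/t0)**(-alpha_t)."""
    return gamma0 * (t / t0) ** (-alpha_t)

ALPHA_ADIABATIC = 3.0 / 8.0   # fully adiabatic expansion (lower end of the range)
ALPHA_RADIATIVE = 9.0 / 22.0  # upper end of the range, with radiative losses included

# Fiducial (assumed) initial conditions: gamma0 = 300 at t0 = 1, evaluated at t = 100 t0.
g_adiabatic = lorentz_factor(100.0, 300.0, 1.0, ALPHA_ADIABATIC)
g_radiative = lorentz_factor(100.0, 300.0, 1.0, ALPHA_RADIATIVE)

# The two indices differ by only 9/22 - 3/8 ≈ 0.034, so the decay curves stay
# close over a wide range of t, which is why adiabatic expansion remains a
# good approximation even when radiative losses are taken into account.
```

Because the two slopes are so close, any observable difference shows up only in detailed fits over a long time baseline, consistent with the abstract's remark that α_t being slightly larger than 3/8 matters mainly for detailed data analysis.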
