This work continues a previous effort (Panaitescu) to study the cooling of relativistic electrons through radiative (synchrotron and self-Compton) emission and adiabatic losses, with application to the spectra and light curves of the synchrotron gamma-ray burst (GRB) emission produced by such cooling electrons. Here, we derive the low-energy slope β_LE of a GRB pulse-integrated spectrum and quantify the implications of the measured distribution of β_LE. Radiative processes that produce soft integrated spectra can accommodate the harder slopes measured by CGRO/BATSE and Fermi/GBM only if the magnetic-field lifetime t_B is shorter than the time needed for the typical GRB electrons to cool enough to radiate below 1–10 keV, which is at most about 10 radiative cooling timescales t_rad of the typical GRB electron. In this case, there is a one-to-one correspondence between t_B and β_LE. To account for low-energy slopes β_LE > −3/4, adiabatic electron cooling requires a similar restriction on t_B. In that case, the diversity of slopes arises mostly from how the electron injection rate varies with time (temporal power-law injection rates yield power-law low-energy GRB spectra) and not from the magnetic-field timescale.
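The dependence of the integrated low-energy slope on the magnetic-field lifetime can be illustrated with a toy calculation. The sketch below (not the paper's model) time-integrates a schematic single-electron synchrotron spectrum while the electron cools radiatively, cutting the emission off at an assumed field lifetime t_B; the spectral shape x^{1/3}e^{-x}, the cooling law γ ∝ (1 + t/t_rad)^{-1}, and the use of the F_ν slope convention are assumptions for illustration and may differ from the paper's definitions of β_LE.

```python
import numpy as np

# Hypothetical illustration (not the paper's code): time-integrated synchrotron
# spectrum of one electron that cools radiatively while the magnetic field
# persists only for a finite lifetime t_B. Times are in units of the radiative
# cooling time t_rad; frequencies are in units of the electron's initial
# characteristic frequency nu_c(0).

def integrated_spectrum(nu, t_B, n_steps=4000):
    """Integrate a schematic single-electron synchrotron spectrum
    F_nu(t) ~ x**(1/3) * exp(-x), with x = nu / nu_c(t) and
    nu_c(t) = (1 + t)**-2 (cooling gamma ~ 1/(1 + t)),
    over 0 <= t <= t_B (emission stops when the field decays)."""
    t = np.linspace(0.0, t_B, n_steps)
    nu_c = (1.0 + t) ** -2                 # characteristic frequency drops as the electron cools
    x = nu[:, None] / nu_c[None, :]        # shape (n_nu, n_time)
    f = x ** (1.0 / 3.0) * np.exp(-x)      # asymptotic single-electron spectrum
    return f.sum(axis=1) * (t[1] - t[0])   # simple Riemann-sum time integration

# Measure the low-energy log-log slope of the pulse-integrated spectrum in a
# band well below the initial characteristic frequency.
nu = np.logspace(-4, -2, 40)
for t_B in (0.3, 3.0, 30.0, 300.0):
    F = integrated_spectrum(nu, t_B)
    slope = np.polyfit(np.log10(nu), np.log10(F), 1)[0]
    print(f"t_B = {t_B:6.1f} t_rad  ->  low-energy F_nu slope ~ {slope:+.2f}")

# Expected trend: the slope stays near the uncooled value +1/3 when t_B is a
# few t_rad or less, and drifts toward the softer fully-cooled value -1/2 as
# t_B grows -- a toy version of the t_B <-> beta_LE mapping described above.
```

This reproduces only the qualitative point of the abstract: truncating the emission at a short t_B hardens the time-integrated low-energy slope, while a long-lived field drives it to the soft fully-cooled limit.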