We investigate synchrotron emission models as the source of gamma-ray burst spectra. We show that including the possibility of synchrotron self-absorption, a ``smooth cutoff'' to the electron energy distribution, and an anisotropic distribution of electron pitch angles produces a wide range of low-energy spectral behavior. In addition, we show that the procedure of spectral fitting to GRB data over a finite bandwidth can introduce a spurious correlation between spectral parameters; in particular, between the peak of the $\nu F_\nu$ spectrum, $E_p$, and the low-energy photon spectral index $\alpha$ (the lower $E_p$, the lower, i.e. softer, the fitted value of $\alpha$). From this correlation and knowledge of the $E_p$ distribution, we show how to derive the expected distribution of $\alpha$. We show that optically thin synchrotron models with an isotropic electron pitch angle distribution can explain the distribution of $\alpha$ below $\alpha = -2/3$. This agreement is achieved if we relax the unrealistic assumption of a sharp low-energy cutoff in the spectrum of accelerated electrons and allow for a more gradual break; we show that this low-energy portion of the electron spectrum can be at most flat. We also show that optically thin synchrotron models with an anisotropic electron pitch angle distribution can explain all bursts with $-2/3 < \alpha \le 0$. The very few bursts with low-energy spectral indices above $\alpha = 0$ may be due to the synchrotron self-absorption frequency entering the lower end of the BATSE window. Our results also predict a particular relationship between $\alpha$ and $E_p$ during the temporal evolution of a GRB. We give examples of spectral evolution in GRBs and discuss how this behavior is consistent with the above models.
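For reference, the low-energy limits quoted above correspond to standard synchrotron asymptotes; the following is a sketch of these textbook results (not a derivation from the paper itself), with the photon number index $\alpha$ defined by $N_E \propto E^{\alpha}$, so that $F_\nu \propto \nu^{\alpha+1}$:
\begin{align*}
F_\nu &\propto \nu^{1/3} \;\Rightarrow\; \alpha = -2/3 && \text{(optically thin, isotropic pitch angles)}\\
F_\nu &\propto \nu^{1} \;\Rightarrow\; \alpha = 0 && \text{(optically thin, small/anisotropic pitch angles)}\\
F_\nu &\propto \nu^{2}\ \text{or}\ \nu^{5/2} \;\Rightarrow\; \alpha = 1\ \text{or}\ 3/2 && \text{(self-absorbed, below the self-absorption frequency } \nu_a\text{)}
\end{align*}
Intermediate values of $\alpha$ between these asymptotes arise when the corresponding break frequencies sit near the lower edge of the fitted bandwidth.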