Abstract

Energy densities of relativistic electrons and protons in extended galactic and intracluster regions are commonly determined from spectral radio and (rarely) $\gamma$-ray measurements. The time-independent particle spectral density distributions are typically assumed to have a power-law (PL) form over the relevant energy range. A theoretical relation between the energy densities of electrons and protons is usually adopted, and energy equipartition is invoked to determine the mean magnetic field strength in the emitting region. We show that under typical conditions, in both star-forming and starburst galaxies, these estimates need to be scaled down substantially because significant energy losses (effectively) flatten the electron spectral density distribution, resulting in a much lower energy density than is deduced when the distribution is assumed to have a PL form. The steady-state electron distribution in the nuclear regions of starburst galaxies is calculated by accounting for Coulomb, bremsstrahlung, Compton, and synchrotron losses; the corresponding emission spectra of the latter two processes are calculated and compared to the respective PL spectra. We also determine the proton steady-state distribution by taking into account Coulomb and pion production losses, and briefly discuss implications of our steady-state particle spectra for estimates of proton energy densities and magnetic fields.
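The loss-driven flattening described above can be illustrated with a minimal steady-state (leaky-box, no escape) calculation, $N(E) = b(E)^{-1}\int_E^\infty Q(E')\,dE'$ with PL injection $Q(E)\propto E^{-p}$. The sketch below is not the paper's actual computation: the injection index and the loss-rate coefficients are illustrative placeholders chosen only to show the asymptotic behavior, with Coulomb losses roughly energy-independent, bremsstrahlung $\propto E$, and synchrotron/Compton $\propto E^2$.

```python
import numpy as np

# Steady-state spectrum under continuous energy losses (leaky-box, no escape):
#   N(E) = (1 / b(E)) * integral_E^inf Q(E') dE',  with Q(E) = K * E**(-p).
# All coefficients below are illustrative placeholders, not fitted values.
p = 2.2                          # assumed injection power-law index
b0, b1, b2 = 1.0, 1e-3, 1e-6     # Coulomb (~const), bremsstrahlung (~E), synch+Compton (~E^2)

def b(E):
    """Total loss rate |dE/dt| (arbitrary units)."""
    return b0 + b1 * E + b2 * E**2

def N(E):
    """Steady-state density: integral_E^inf E'^(-p) dE' = E^(1-p)/(p-1) for p > 1."""
    return E**(1.0 - p) / ((p - 1.0) * b(E))

E = np.logspace(0, 6, 601)                        # energy grid (arbitrary units)
slope = -np.gradient(np.log(N(E)), np.log(E))     # local spectral index of N(E)

# Coulomb-dominated end flattens toward p - 1; synchrotron-dominated end
# steepens toward p + 1, relative to the injected index p.
print(f"low-E index ~ {slope[0]:.2f}, high-E index ~ {slope[-1]:.2f}")
```

With these placeholder coefficients the low-energy index tends to $p-1$ (a flatter spectrum, hence a lower integrated energy density than a single PL would give) and the high-energy index to $p+1$, bracketing the injected $E^{-p}$ form.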
