Abstract

Galactic gamma-ray astronomy at very high energy (E > 30 TeV) is a vital tool in the study of the non-thermal universe. The interpretation of observations in this energy region requires precise modeling of the attenuation of photons due to pair-production interactions, where the targets are the radiation fields present in interstellar space. For gamma rays with energy E > 300 TeV the attenuation is mostly due to the photons of the Cosmic Microwave Background Radiation (CMBR). At lower energies the most important targets are infrared photons emitted by dust, with wavelengths in the range 50–500 μm. The evaluation of the attenuation requires a good knowledge of the density and of the energy and angular distributions of the target photons at all positions in the Galaxy. In this work we discuss a simple model for the infrared radiation that depends on only a few parameters associated with the spatial and temperature distributions of the emitting dust. The model makes it possible to compute with good accuracy the effects of absorption for any spatial and energy distribution of the diffuse Galactic gamma-ray emission. The absorption probability due to the Galactic infrared radiation is maximum for E ≈ 150 TeV, and can be as large as P_abs ≈ 0.45 for distant sources on lines of sight that pass close to the Galactic Center. The systematic uncertainties on the absorption probability are estimated as ΔP_abs < 0.08.
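
For context, the attenuation summarized above follows from the standard pair-production optical depth, P_abs = 1 − exp(−τ), with τ(E) = ∫ dl ∫ dε ∫ dμ (1−μ)/2 n(ε, l) σ_γγ(s) and s = 2Eε(1−μ). The sketch below is not from the paper: it evaluates this integral for the CMB target only, which the abstract identifies as dominant above ~300 TeV, using the Breit-Wheeler cross section and a blackbody photon density. The 8 kpc path length, the integration grids, and all function names are illustrative assumptions; the paper's few-parameter infrared dust model is not reproduced here.

```python
# Hedged sketch: absorption probability of VHE gamma rays on the CMB
# via gamma-gamma pair production, P_abs = 1 - exp(-tau).
# CMB target only; the paper's infrared model is NOT implemented.
import numpy as np

trapz = getattr(np, "trapezoid", None) or np.trapz  # NumPy 2.x renamed trapz

SIGMA_T = 6.6524e-25          # Thomson cross section [cm^2]
ME      = 0.511e6             # electron rest energy m_e c^2 [eV]
HBARC   = 1.97327e-5          # hbar * c [eV cm]
KT_CMB  = 2.725 * 8.617e-5    # CMB temperature k_B T [eV]
KPC     = 3.0857e21           # kiloparsec [cm]

def n_cmb(eps):
    """Blackbody photon number density per unit energy [1/(eV cm^3)]."""
    return eps**2 / (np.pi**2 * HBARC**3 * np.expm1(eps / KT_CMB))

def sigma_gg(s):
    """Breit-Wheeler gamma-gamma -> e+e- total cross section [cm^2], s in eV^2."""
    beta = np.sqrt(np.clip(1.0 - 4.0 * ME**2 / np.maximum(s, 4.0 * ME**2), 0.0, 1.0))
    out = np.zeros_like(beta)
    ok = (beta > 0) & (beta < 1)          # zero below threshold s = 4 m_e^2
    b = beta[ok]
    out[ok] = (3.0 * SIGMA_T / 16.0) * (1.0 - b**2) * (
        (3.0 - b**4) * np.log((1.0 + b) / (1.0 - b)) - 2.0 * b * (2.0 - b**2))
    return out

def inverse_mfp(E):
    """Inverse mean free path [1/cm] for a gamma ray of energy E [eV],
    averaged over an isotropic blackbody target field."""
    eps = np.logspace(np.log10(KT_CMB / 50), np.log10(KT_CMB * 50), 400)
    mu = np.linspace(-1.0, 1.0, 400)      # cosine of the collision angle
    EPS, MU = np.meshgrid(eps, mu, indexing="ij")
    s = 2.0 * E * EPS * (1.0 - MU)        # squared center-of-mass energy [eV^2]
    integrand = n_cmb(EPS) * 0.5 * (1.0 - MU) * sigma_gg(s)
    return trapz(trapz(integrand, mu, axis=1), eps)

# Illustrative example: a source at an assumed distance of 8 kpc.
for E_TeV in (100.0, 300.0, 1000.0, 3000.0):
    tau = inverse_mfp(E_TeV * 1e12) * 8.0 * KPC
    print(f"E = {E_TeV:6.0f} TeV  tau_CMB = {tau:6.3f}  P_abs = {1.0 - np.exp(-tau):5.3f}")
```

A plain trapezoidal grid is used instead of adaptive quadrature because the blackbody integrand is smooth over the chosen ranges. Note that reproducing the abstract's P_abs ≈ 0.45 peak near 150 TeV would additionally require the Galactic infrared component, since the CMB alone contributes little absorption at that energy.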
