Abstract

Space radiation exposure from omnipresent Galactic Cosmic Rays (GCRs) in interplanetary space poses a serious carcinogenic risk to astronauts, owing to the limited or absent protective effect of the Earth’s magnetosphere and, in particular, of the terrestrial atmosphere. The radiation risk is directly influenced by the quality of the radiation, i.e., its pattern of energy deposition at the micron/DNA scale. For stochastic biological effects, radiation quality is described by the quality factor, Q, which can be defined as a function of either the Linear Energy Transfer (LET) or the microdosimetric lineal energy (y). In the present work, the average Q of GCRs for different mission scenarios was calculated using a modified version of the microdosimetric Theory of Dual Radiation Action (TDRA). NASA’s OLTARIS platform was used to generate the radiation environment behind aluminum shielding of different thicknesses (0–30 g/cm²) for a typical mission scenario in low-Earth orbit (LEO) and in deep space. The microdosimetric lineal energy spectra of ions (Z ≥ 1) in 1 μm liquid water spheres were calculated by a generalized analytical model that accounts for energy-loss fluctuations and δ-ray transport inside the irradiated medium. The present TDRA-based Q-values for the LEO and deep-space missions were found to differ by up to 10% and 14%, respectively, from the corresponding ICRP-based Q-values, and by up to 3% and 6% from NASA’s Q-model. In addition, they were found to be in good agreement with the Q-values measured on the International Space Station (ISS) and by the Mars Science Laboratory (MSL) Radiation Assessment Detector (RAD), which represent a LEO and a deep-space radiation environment, respectively.
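
As context for the comparison reported above, a minimal sketch of the two standard definitions of the dose-averaged quality factor being contrasted is given below; the specific TDRA-based weighting function q(y) used in this work is not stated in the abstract, so only the generic LET-based and lineal-energy-based forms, together with the ICRP 60 Q(L) relation, are shown.

\[
\bar{Q}_{\mathrm{LET}} = \frac{1}{D}\int Q(L)\, D(L)\, \mathrm{d}L,
\qquad
\bar{Q}_{\mathrm{micro}} = \int q(y)\, d(y)\, \mathrm{d}y,
\]
with the ICRP 60 quality factor
\[
Q(L) =
\begin{cases}
1, & L < 10\ \mathrm{keV/\mu m},\\[2pt]
0.32\,L - 2.2, & 10 \le L \le 100\ \mathrm{keV/\mu m},\\[2pt]
300/\sqrt{L}, & L > 100\ \mathrm{keV/\mu m},
\end{cases}
\]
where D(L) is the distribution of absorbed dose in LET and d(y) is the dose distribution of lineal energy in the 1 μm site.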
