The role of radiolytic oxygen consumption in the in-vitro ultra-high dose rate (UHDR) sparing effect and the in-vivo FLASH effect is subject to active debate, and data on key dependencies such as radiation quality are lacking. The influence of dose-averaged linear energy transfer (LETd) and dose rate on radiolytic oxygen consumption was investigated by monitoring the oxygen concentration during irradiation with electrons, protons, and helium, carbon, and oxygen ions at UHDR and standard dose rates (SDR). Sealed 5% bovine serum albumin (BSA) samples were exposed to 15 Gy of electrons and protons and, for the first time, helium, carbon, and oxygen ions, with LETd values of 1, 5.4, 14.4, 65, and 100.3 keV/µm, respectively, delivered at mean dose rates of either 0.3-0.4 Gy/s (SDR) or approximately 100 Gy/s (UHDR). The Oxylite system (Oxford Optronics) was used to measure the oxygen concentration before and after irradiation and to calculate the oxygen consumption rate. The oxygen consumption rate decreased with increasing LETd, from 0.351 mmHg/Gy for low-LET electrons to 0.1796 mmHg/Gy for high-LET oxygen ions at SDR, and from 0.317 to 0.1556 mmHg/Gy at UHDR. For all particle types, the consumption rate at SDR was higher than at the corresponding UHDR irradiation. The measured consumption rates demonstrate a distinct LETd dependence. The dataset, covering a wide range of LETd values, could serve as a benchmark for Monte Carlo simulations and may help improve the understanding of oxygen-related mechanisms after irradiation. Ultimately, it could help assess the viability of different hypotheses regarding UHDR sparing mechanisms and the FLASH effect. The observed LETd dependence underscores the potential of heavy-ion therapy, in which elevated consumption rates in adjacent normal tissue offer protective benefits while leaving tumor regions, with their generally higher linear energy transfer (LET), vulnerable.
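A minimal sketch of the consumption-rate calculation implied above, assuming the rate is simply the drop in oxygen partial pressure between the pre- and post-irradiation readings divided by the delivered dose; the function name, variable names, and example values are illustrative assumptions, not taken from the paper.

```python
def oxygen_consumption_rate(p_o2_before_mmHg: float,
                            p_o2_after_mmHg: float,
                            dose_Gy: float) -> float:
    """Radiolytic oxygen consumption rate in mmHg/Gy (assumed definition)."""
    if dose_Gy <= 0:
        raise ValueError("Delivered dose must be positive.")
    # Difference in oxygen partial pressure divided by the delivered dose.
    return (p_o2_before_mmHg - p_o2_after_mmHg) / dose_Gy


# Hypothetical readings for a single 15 Gy exposure:
rate = oxygen_consumption_rate(p_o2_before_mmHg=60.0,
                               p_o2_after_mmHg=55.0,
                               dose_Gy=15.0)
print(f"Consumption rate: {rate:.3f} mmHg/Gy")  # ~0.333 mmHg/Gy
```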