Abstract

Purpose: Dose and dose rate are both appropriate for estimating risk from internally deposited radioactive materials. We investigated the role of dose rate in lung cancer induction in Beagle dogs following a single inhalation of strontium-90 (90Sr), cerium-144 (144Ce), yttrium-91 (91Y), or yttrium-90 (90Y). Because retention of a radionuclide depends on both biological clearance and physical half-life, a representative quantity is needed to describe this complex, changing dose rate.

Materials and methods: Data were obtained from Beagle dog experiments conducted at the Inhalation Toxicology Research Institute. The authors selected the dose rate at the effective half-life of each radionuclide (DRef) as that representative quantity.

Results: Dogs exposed at high DRef (1–100 Gy/day) died within the first year after exposure from acute lung disease. Dogs exposed at lower DRef (0.1–10 Gy/day) died of lung cancer. As DRef decreased further (<0.1 Gy/day for 90Sr, <0.5 Gy/day for 144Ce, <0.9 Gy/day for 91Y, <8 Gy/day for 90Y), survival and lung cancer frequency were not significantly different from those of control dogs.

Conclusion: Radiation exposure resulting from inhalation of beta-gamma emitting radionuclides, which decay at different rates according to their effective half-lives and therefore produce different rates of decrease in dose rate and accumulation of cumulative dose, is less effective in causing cancer than acute low linear energy transfer exposure of the lung.
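As background to how DRef is defined, the effective half-life combines physical decay and biological clearance in the standard way shown below; the single-compartment, single-exponential retention model and the symbols $T_\mathrm{phys}$, $T_\mathrm{bio}$, and $\dot D_0$ are illustrative assumptions, not dosimetry taken from the study:

$$
\frac{1}{T_\mathrm{eff}} = \frac{1}{T_\mathrm{phys}} + \frac{1}{T_\mathrm{bio}},
\qquad
\dot D(t) = \dot D_0\, e^{-\lambda_\mathrm{eff} t},
\qquad
\lambda_\mathrm{eff} = \frac{\ln 2}{T_\mathrm{eff}} .
$$

Under this simplified model, the dose rate evaluated at the effective half-life is $\dot D(T_\mathrm{eff}) = \dot D_0/2$, i.e. DRef would correspond to half the initial dose rate.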
