Abstract

Cyclotron production of 99mTc through the 100Mo(p,2n)99mTc reaction channel is actively being investigated as an alternative to reactor-based 99Mo generation by nuclear fission of 235U. Like most radioisotope production methods, cyclotron production of 99mTc results in the creation of unwanted impurities, including both Tc and non-Tc isotopes. Measuring the amounts of these impurities is essential before cyclotron-produced 99mTc (CPTc) can be released for clinical use. Detection of radioactive impurities relies on measurements of their gamma (γ) emissions. Gamma spectroscopy is not suitable for this purpose because the overwhelming presence of 99mTc, combined with the count-rate limitations of γ-spectroscopy systems, precludes fast and accurate measurement of small amounts of impurities. In this article we describe a simple and fast method for measuring γ emission rates from radioactive impurities in CPTc. The proposed method is similar to that used to identify 99Mo breakthrough in generator-produced 99mTc: one dose calibrator (DC) reading of a CPTc source placed in a lead shield is followed by a second reading of the same source in air. Our experimental and theoretical analyses show that the ratio of the DC reading in lead to that in air is linearly related to the γ emission rate from impurities per MBq of 99mTc over a large range of clinically relevant production conditions. We show that estimates of the γ emission rates from Tc impurities per MBq of 99mTc can be used to estimate the increase in radiation dose (relative to pure 99mTc) to patients injected with CPTc-based radiopharmaceuticals. This enables dosimetry-based clinical-release criteria that can be tested using commercially available dose calibrators. We show that our approach is highly sensitive to the presence of seven Tc impurity isotopes, in addition to a number of non-Tc impurities.
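The two-reading test described above can be sketched as a small calculation. This is only an illustration of the stated linear relationship between the lead-to-air reading ratio and the impurity γ emission rate per MBq of 99mTc; the slope and intercept below are hypothetical placeholders, not calibration constants from the article, and would have to be determined experimentally for a specific dose calibrator and shield geometry.

```python
def impurity_gamma_rate_per_mbq(reading_in_lead, reading_in_air,
                                slope=1.0, intercept=0.0):
    """Estimate the gamma emission rate from impurities per MBq of 99mTc.

    Per the abstract, the ratio of dose calibrator readings (lead / air)
    is linearly related to the impurity gamma emission rate per MBq of
    99mTc:

        rate ~ slope * (R_lead / R_air) + intercept

    `slope` and `intercept` are hypothetical calibration constants
    (defaults chosen only so the function runs), not values from the
    article.
    """
    if reading_in_air <= 0:
        raise ValueError("in-air reading must be positive")
    ratio = reading_in_lead / reading_in_air
    return slope * ratio + intercept


# Example: a shielded reading of 2.0 units against an in-air reading of
# 4.0 units gives a lead-to-air ratio of 0.5; with the placeholder
# calibration (slope=1, intercept=0) the estimated rate equals the ratio.
print(impurity_gamma_rate_per_mbq(2.0, 4.0))  # -> 0.5
```

A release criterion would then compare this estimate against a dosimetry-based threshold, which is the use the abstract envisions for commercially available dose calibrators.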
