Abstract

Previously developed MR-based three-dimensional (3D) Fricke-xylenol orange (FXG) dosimeters can provide end-to-end quality assurance and validation protocols for pre-clinical radiation platforms. FXG dosimeters quantify ionizing-radiation-induced oxidation of Fe2+ ions using pre- and post-irradiation MR imaging methods that detect the changes in the spin-lattice relaxation rate (R1 = 1/T1) caused by that oxidation. Chemical changes in MR-based FXG dosimeters that occur over time and with changes in temperature can decrease dosimetric accuracy if they are not properly characterized and corrected. This paper describes the characterization, development, and utilization of an empirical model-based correction algorithm for time and temperature effects in the context of a pre-clinical irradiator and a 7 T pre-clinical MR imaging system.

Time- and temperature-dependent changes in R1 values were characterized using variable-TR spin-echo imaging. R1-time and R1-temperature dependencies were fit using non-linear least-squares methods, and the resulting models were validated using leave-one-out cross-validation and resampling. A correction algorithm was then developed that employed the fitted empirical models to predict and reduce baseline R1 shifts occurring in the presence of time and temperature changes. The correction algorithm was tested on R1-dose response curves and 3D dose distributions delivered using a small animal irradiator at 225 kVp.

The correction algorithm reduced baseline R1 shifts from −2.8 × 10−2 s−1 to 1.5 × 10−3 s−1. In terms of absolute dosimetric performance as assessed with traceable standards, it reduced dose discrepancies from approximately 3% to approximately 0.5% (2.90 ± 2.08% to 0.20 ± 0.07%, and 2.68 ± 1.84% to 0.46 ± 0.37%, for the 10 × 10 and 8 × 12 mm2 fields, respectively).

Chemical changes in MR-based FXG dosimeters produce time- and temperature-dependent R1 values over the time intervals and temperature variations found in a typical small animal imaging and irradiation laboratory setting. These changes cause baseline R1 shifts that negatively affect dosimeter accuracy. Characterization, modeling, and correction of these effects improved in-field reported dose accuracy to better than 1% when compared with standardized ion chamber measurements.
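As context for the methods summarized above, the following is a minimal sketch of how R1 can be estimated from variable-TR spin-echo signals by non-linear least squares, and how a fitted empirical baseline model might be subtracted as a correction. The saturation-recovery signal model is standard; the baseline model's linear form, its coefficients, and all numeric values here are illustrative assumptions, not the paper's fitted models.

```python
import numpy as np
from scipy.optimize import curve_fit

def sr_signal(tr, s0, r1):
    """Saturation-recovery spin-echo signal: S(TR) = S0 * (1 - exp(-TR * R1))."""
    return s0 * (1.0 - np.exp(-tr * r1))

def fit_r1(tr_values, signals):
    """Estimate R1 (s^-1) from variable-TR spin-echo data via
    non-linear least squares on the saturation-recovery model."""
    popt, _ = curve_fit(sr_signal, tr_values, signals,
                        p0=(np.max(signals), 1.0))
    return popt[1]

def baseline_r1_shift(hours_since_prep, temp_c, ref_temp_c=20.0,
                      k_time=1e-3, k_temp=5e-3):
    """Predicted baseline R1 drift (s^-1) as a simple linear function of
    elapsed time and temperature offset. Hypothetical placeholder: the
    actual work fit empirical non-linear time/temperature models."""
    return k_time * hours_since_prep + k_temp * (temp_c - ref_temp_c)

# Simulated voxel with true R1 = 0.55 s^-1, measured at eight TR values.
tr = np.array([0.05, 0.1, 0.3, 0.6, 1.2, 2.5, 5.0, 10.0])  # seconds
rng = np.random.default_rng(0)
signal = sr_signal(tr, 100.0, 0.55) + rng.normal(0.0, 0.2, tr.size)

r1_measured = fit_r1(tr, signal)
# Subtract the predicted drift for, e.g., 4 h post-preparation at 22 °C.
r1_corrected = r1_measured - baseline_r1_shift(4.0, 22.0)
```

In the paper's workflow the corrected R1 values, rather than the raw ones, would then be mapped to dose through the R1-dose response curve.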
