Abstract

This note investigates temperature effects on dosimetry with a metal oxide semiconductor field effect transistor (MOSFET) detector for radiotherapy x-ray treatment. This was done by analysing the dose response and threshold voltage outputs of MOSFET dosimeters as a function of ambient temperature. The results show that the clinical semiconductor dosimetry system (CSDS) MOSFET provides stable dose measurements over temperatures from 15 °C to 40 °C. Thus, standard irradiations performed at room temperature can be compared directly with in vivo dose assessments performed at near body temperature, without a temperature correction function. The MOSFET dosimeter threshold voltage does vary with temperature, and the magnitude of this variation depends on the dose history of the dosimeter; however, the variation can be accounted for in the measurement method. For accurate dosimetry, the detector should remain on the patient for approximately 60 s to reach thermal equilibrium before measurement, with the final reading taken while the detector is still attached to the patient. Alternatively, if the initial readout was taken at room temperature, the detector should be left for approximately 120 s after removal from the patient so that temperature equilibrium is re-established before the final reading.

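As an illustration only, the readout timing recommended above can be sketched in code. This is a minimal, hypothetical sketch: only the 60 s and 120 s equilibration times come from the abstract, while the reader object and its read_threshold_voltage() method are assumed placeholders, not part of any real CSDS MOSFET API.

import time

ON_PATIENT_EQUILIBRATION_S = 60     # settle time after attaching the detector to the patient
POST_REMOVAL_EQUILIBRATION_S = 120  # settle time after removal, when readout is at room temperature

def read_with_thermal_equilibrium(reader, read_on_patient: bool) -> float:
    """Return a threshold-voltage reading taken after thermal equilibrium (hypothetical reader object)."""
    if read_on_patient:
        # Final reading taken while the detector is still attached:
        # wait about 60 s so the dosimeter reaches near body temperature.
        time.sleep(ON_PATIENT_EQUILIBRATION_S)
    else:
        # Initial readout was at room temperature: after removing the detector
        # from the patient, wait about 120 s so it returns to room temperature.
        time.sleep(POST_REMOVAL_EQUILIBRATION_S)
    return reader.read_threshold_voltage()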