Thermoluminescence (TL) dosimetry relies on evaluating the dose absorbed in the TL detector by measuring the light emitted by the detector, i.e. by TL glow-curve analysis. However, the absolute efficiency of TL light emission per unit dose of ionizing radiation absorbed in the detector is known to depend on the energy and quality (ionization density) of this radiation. Moreover, as the TL light is absorbed in the detector itself, the spatial distribution of energy deposition events inside the detector also needs to be considered. It is convenient to describe the response of the detector (TL output per unit dose) relative to that after a dose of sparsely ionizing reference radiation, such as 137Cs γ-rays, via the relative efficiency, η_{i,γ}, defined as the TL light signal emitted by the TL detector per unit imparted energy of radiation of type i, normalized to the signal per unit imparted energy of this reference radiation. Microdosimetric models have provided insight into the variation of η_{i,γ} with energy and ionization density, related to the spatial distribution of ionizations and excitations produced by the ionizing radiation in the detector, as well as to some experimental factors related to the TL light transport within the detector. To study the variation of η_{i,γ} with LET in LiF:Mg,Ti detectors irradiated by heavy charged particles (high-LET radiation), the most successful approach has been the track structure model, based on the radial distribution of dose (RDD) around the ion tracks. For low-LET radiation (photons, electrons), the microdosimetric model has been successfully applied to predict η_{i,γ} for LiF:Mg,Ti, LiF:Mg,Cu,P and CaF2:Tm TL detectors, and to explain the discrepancy between the measured and predicted photon-energy response of these detectors.
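For reference, the relative efficiency defined above can be written explicitly; the sketch below is a standard formulation from the TL literature, not a quotation from this work, and the symbols R (TL signal) and D (mean imparted energy per unit mass) are illustrative labels introduced here:

\[
\eta_{i,\gamma} \;=\; \frac{R_i / \bar{D}_i}{R_\gamma / \bar{D}_\gamma},
\]

and, in a commonly used track structure (RDD-based) formulation for a heavy charged particle, the efficiency relative to γ-rays is estimated by folding the radial dose distribution D(r) around the ion track with the normalized low-dose γ-ray TL response f_\gamma(D):

\[
\eta_{\mathrm{HCP},\gamma} \;\approx\;
\frac{\displaystyle\int_{0}^{r_{\max}} f_\gamma\!\bigl(D(r)\bigr)\, 2\pi r \, dr}
     {\displaystyle\int_{0}^{r_{\max}} D(r)\, 2\pi r \, dr},
\]

where r_{\max} is the maximum radial extent of the track. The exact form of f_\gamma and of the RDD used in the cited modelling may differ; this is only an indicative expression under those assumptions.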