Abstract

Microdosimetric spectra of single-event distributions have been used to provide estimates of quality factors for radiation protection against high-LET radiation. At high dose rates it becomes difficult to measure, record, and store the energy deposition of single events. An alternative approach is to record random energy deposition events in a sequence of fixed time intervals, which does not require identifying individual events and can be accomplished with a single detector without pulse height analysis. We present the development of the algorithm using expectation analysis of the statistical estimators for the moments of lineal energy, ȳF and ȳD. The method was tested using Monte Carlo simulations based on single-event distributions measured with spherical tissue-equivalent proportional counters, where the event sizes spanned more than two orders of magnitude. The evaluation included testing at various mean numbers of events per interval (i.e., dose rate) and numbers of intervals (i.e., total duration). Results of the expectation analysis and the Monte Carlo simulations showed that the algorithm corrects for the excess dispersion due to the random number of events in each time interval when the underlying dose rate is constant. It also converges to the correct value when the dose rate varies linearly over the duration of the measurement. Although this system is not applicable to pulsed radiation fields, it proved robust when applied to single-event spectra (measured PuBe neutron and 1,000 MeV/nucleon Fe ion spectra, as well as a power-function distribution of single-event sizes), with a coefficient of variation of 25% for estimates of ȳD using 100 sampling intervals and 10% using 400 sampling intervals.
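The abstract does not spell out the estimator, so the sketch below is a minimal Python illustration of the general fixed-time-interval idea, assuming the classical variance method: for interval sums S with a Poisson number of events per interval, E[S] = λE[y] and Var[S] = λE[y²], so ȳD = E[y²]/E[y] = Var[S]/E[S], and the ratio cancels the excess dispersion from the random event count, consistent with the correction the abstract describes. The power-function parameters, event rate, and all function names are illustrative assumptions, not values from the paper, and the actual algorithm may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_event_sizes(n, y_min=0.3, y_max=100.0, alpha=1.5):
    """Draw n lineal-energy values y from p(y) ~ y**(-alpha) on [y_min, y_max].

    A power-function spectrum spanning more than two orders of magnitude,
    as in the abstract; the exponent and range are assumed, not from the paper.
    """
    u = rng.random(n)
    a = 1.0 - alpha
    return (u * (y_max**a - y_min**a) + y_min**a) ** (1.0 / a)

def interval_sums(n_intervals, mean_events):
    """Total energy deposited in each fixed time interval.

    Each interval records only a sum; individual events are never
    identified (no pulse height analysis).
    """
    counts = rng.poisson(mean_events, n_intervals)
    return np.array([sample_event_sizes(k).sum() for k in counts])

def estimate_yD(sums):
    """Variance-method estimate: yD = Var[S] / E[S] at constant dose rate."""
    return sums.var(ddof=1) / sums.mean()

if __name__ == "__main__":
    # Reference value computed directly from the single-event spectrum.
    y = sample_event_sizes(1_000_000)
    yD_true = (y**2).mean() / y.mean()
    # Fewer intervals -> larger scatter in the estimate, mirroring the
    # 100- vs 400-interval comparison reported in the abstract.
    for n_intervals in (100, 400):
        est = estimate_yD(interval_sums(n_intervals, mean_events=20))
        print(f"{n_intervals} intervals: yD_hat = {est:.2f} (true ~ {yD_true:.2f})")
```

Rerunning the loop with many random seeds and taking the spread of the estimates would reproduce a coefficient-of-variation comparison of the kind quoted above.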
