Abstract

To evaluate the amount of energy deposited in radiosensitive organs and tissues of the human body when an anthropomorphic phantom is irradiated, researchers in numerical dosimetry use so-called exposure computational models (ECMs). An ECM can be pictured as a virtual scene composed of a phantom placed in a mathematically defined position relative to a radioactive source. The source in these ECMs produces the initial state of the simulation: the position, direction, and energy with which each particle enters the phantom are essential variables. For the subsequent states of a particle's history, robust Monte Carlo (MC) codes are used, which simulate the mean free path the particle travels without interacting, its interaction with the atoms of the medium, and the amount of energy deposited per interaction. MC codes also evaluate normalization quantities, so the results are printed in text files as conversion coefficients between the absorbed dose and the selected normalization quantity. Since the 2000s, the authors have published ECMs in which a voxel phantom is irradiated by photons in the environment of the MC code EGSnrc (EGS = Electron Gamma Shower; nrc = National Research Council Canada). The production of articles, dissertations, and theses required specific computational tools, such as the FANTOMAS, DIP (Digital Image Processing), and Monte Carlo applications, for the various steps of numerical dosimetry, which range from the preparation of input files, to the execution of the ECM, to the organization and graphical and numerical analysis of the results. This article reviews computational phantoms for dosimetry, mainly those produced in DEN-UFPE dissertations and theses.
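As a rough illustration of the transport step described above, the following toy Python sketch follows a monoenergetic photon beam through a homogeneous slab, sampling each free path from an exponential distribution and treating every interaction as full local absorption, then reports the result as a conversion coefficient (absorbed dose per unit fluence). The beam energy, attenuation coefficient, and geometry are illustrative assumptions; the sketch does not reproduce the physics, geometry handling, or structure of EGSnrc or of the authors' ECMs.

```python
import numpy as np

# Toy, heavily simplified photon-transport sketch (NOT the EGSnrc algorithm):
# a monoenergetic pencil beam enters a homogeneous slab, free paths are
# sampled from an exponential distribution, and every interaction is treated
# as full local absorption. All numerical values are illustrative assumptions.

rng = np.random.default_rng(42)

E0_MeV      = 0.662      # incident photon energy (assumed)
mu_per_cm   = 0.086      # linear attenuation coefficient of the medium (assumed)
slab_cm     = 30.0       # slab thickness along the beam axis
area_cm2    = 1.0        # irradiated cross-sectional area
density     = 1.0        # g/cm^3, water-like medium (assumed)
n_histories = 100_000    # number of particle histories

deposited_MeV = 0.0
for _ in range(n_histories):
    # free path travelled without interacting, sampled from exp(mu)
    path = rng.exponential(1.0 / mu_per_cm)
    if path < slab_cm:
        # crude model: the interaction deposits the full photon energy locally
        deposited_MeV += E0_MeV

mass_g   = density * area_cm2 * slab_cm
MeV_to_J = 1.602e-13
dose_Gy  = deposited_MeV * MeV_to_J / (mass_g * 1e-3)   # absorbed dose (Gy)
fluence  = n_histories / area_cm2                       # photons per cm^2
print(f"conversion coefficient D/Phi = {dose_Gy / fluence:.3e} Gy*cm^2")
```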
