Abstract

One of the emerging issues in radiography is low-dose imaging to minimize patient exposure. The scintillating materials employed in most indirect flat-panel detectors show a drastic change in X-ray photon absorption efficiency around their K-edge energies, which consequently affects image quality. Using various tube voltages, we investigated the imaging performance of the two most popular scintillators: cesium iodide (CsI) and gadolinium oxysulfide (Gd2O2S). The integrated detective quantum efficiencies (iDQE) of four detectors installed in the same hospital were evaluated according to the standardized procedure IEC 62220-1 at tube voltages of 40–120 kVp. The iDQE values of the Gd2O2S detectors were normalized by those of the CsI detectors to exclude the effects of image postprocessing. The contrast-to-noise ratios (CNR) were also evaluated using an anthropomorphic chest phantom. The iDQE of the CsI detector outperformed that of the Gd2O2S detector at all tube voltages. Moreover, we noted that the iDQE of the Gd2O2S detectors rolled off quickly as the tube voltage decreased below 70 kVp. The CNRs of the two scintillators were similar at 120 kVp. At 60 kVp, however, the CNR of Gd2O2S was about half that of CsI. Compared to the Gd2O2S detectors, the DQE performance of the CsI detectors was relatively immune to variations in the applied tube voltage. Therefore, we claim that Gd2O2S detectors are inappropriate for low-tube-voltage imaging (e.g., extremities and pediatrics) with low patient exposure.
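The two figures of merit above can be sketched in code. The following is a minimal illustration, not the authors' analysis pipeline: it assumes the conventional CNR definition, (mean signal − mean background) / background standard deviation, applied to phantom region-of-interest pixel values, and expresses each Gd2O2S iDQE as a ratio to the CsI iDQE measured at the same tube voltage. All function names and numbers are hypothetical.

```python
from statistics import mean, pstdev

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio from two ROI pixel-value lists:
    (mean signal - mean background) / background noise (std dev)."""
    return (mean(signal_roi) - mean(background_roi)) / pstdev(background_roi)

def relative_idqe(idqe_gd2o2s, idqe_csi):
    """Normalize Gd2O2S iDQE by the CsI iDQE at each tube voltage (kVp),
    canceling detector-independent factors such as image postprocessing."""
    return {kvp: idqe_gd2o2s[kvp] / idqe_csi[kvp] for kvp in idqe_gd2o2s}

# Hypothetical ROI pixel values from a chest-phantom image
signal = [110, 112, 108, 111]
background = [100, 101, 99, 100]
print(round(cnr(signal, background), 1))

# Hypothetical iDQE values illustrating the roll-off below 70 kVp
ratios = relative_idqe({60: 0.20, 120: 0.38}, {60: 0.40, 120: 0.42})
print(ratios)
```

With these hypothetical inputs, the 60 kVp ratio (0.5) is markedly lower than the 120 kVp ratio, mirroring the reported behavior of the Gd2O2S detectors at low tube voltages.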
