Abstract

Advances in computed tomography (CT) technology have expanded the clinical applications of CT imaging. As a result, the use of CT has increased markedly in recent years, raising concerns about the stochastic risks of radiation and, consequently, about safety in patient care. To comply with the “as low as reasonably achievable” principle, there is a need to develop appropriate strategies to optimize CT examinations. CT manufacturers strive to develop techniques that reduce radiation dose while delivering tomographic images of diagnostic quality. Automatic exposure control systems and iterative reconstruction algorithms constitute the state-of-the-art techniques for radiation dose optimization in CT. However, technological advancements have led to the introduction of new parameters that complicate CT examination protocols. Parameters that affect radiation dose and image quality in CT include the quality reference tube current, quality reference image noise, tube voltage, quality reference contrast-to-noise ratio, beam width, pitch, length of z-overscan, reconstruction kernel, and the weight of blending between filtered back projection and iterative reconstruction algorithms. To optimize CT examination protocols, a basic understanding of CT scan parameters and their effect on image quality is required. CT radiation dose optimization is an important issue that needs to be addressed first by CT vendors and subsequently by radiologists, medical physicists and radiologic technologists. In this presentation, the basic principles of CT radiation exposure and the strategies followed for CT radiation dose optimization are reviewed from the medical physicist’s perspective.
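As background to how these parameters interact, the standard dose descriptors and the basic noise–dose relation can be sketched as follows (conventional definitions; the noise proportionality is a simplification that assumes filtered back projection):

\[
\mathrm{CTDI_{vol}} \;=\; \frac{\mathrm{CTDI_{w}}}{\text{pitch}},
\qquad
\mathrm{DLP} \;=\; \mathrm{CTDI_{vol}} \times L,
\qquad
\sigma \;\propto\; \frac{1}{\sqrt{\mathrm{mAs}}}
\]

Here \(L\) is the irradiated scan length (including any z-overscan) and \(\sigma\) is image noise. The \(1/\sqrt{\mathrm{mAs}}\) scaling explains why halving the tube current–time product raises noise by roughly 40% under filtered back projection, and why iterative reconstruction and its blending weight can recover acceptable noise at reduced dose.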
