Objective. The introduction of automatic tube current modulation (ATCM) has resulted in complex relationships between scanner parameters, patient body habitus, radiation dose, and image quality. ATCM adjusts tube current based on x-ray attenuation variations in the scan region, and overall patient dose depends on a combination of factors. This work aims to develop mathematical models that predict CT radiation dose and image noise in terms of attenuating diameter and all relevant scanner parameters.

Approach. A homogeneous phantom, equipped with features for conducting discrete and continuous adaptation tests, was developed to model ATCM in a Philips CT scanner. Scanner parameters were varied based on theoretical dose relationships, and a MATLAB script was developed to extract data from DICOM images. R statistical software was employed for data analysis, plotting, and regression modelling.

Main results. Phantom data provided the following insights. Median tube current decreased by 81% as tube potential was increased from 80 kVp to 140 kVp. Doubling the DoseRight Index (DRI) from 12 to 24, at 24 cm diameter, produced a 294% increase in mA and a 46% decrease in noise. Mean mA increased by 53% while mean noise increased by 5.7% as helical pitch increased from 0.6 to 0.925. Changing rotation time from 0.33 s to 0.75 s gave a 56% reduction in mean mA and no change in image noise. Increasing detector collimation (n×T) resulted in higher tube currents and lower output image noise values, as n and T were varied independently. Applying transformations appropriate to each independent variable produced models for tube current and noise with adjusted R-squared values of 0.965 and 0.912, respectively.

Significance. The models developed predict radiation dose and image quality more accurately for specific patients and scanner settings. They provide imaging professionals with a practical tool to optimize scan protocols according to patient diameter and clinical objectives.
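The abstract does not state the functional form of the fitted models. As a rough illustration only, the R sketch below shows the kind of log-transformed multiple regression that could relate tube current to scanner settings and attenuating diameter; all variable names, transformations, and synthetic values are assumptions for illustration, not the study's actual data or model.

```r
# Illustrative sketch only: hypothetical variable names and synthetic data,
# not the study's measurements or published model form.
set.seed(1)
n_obs <- 200

# Hypothetical acquisition settings spanning the ranges reported in the abstract
scans <- data.frame(
  kvp      = sample(c(80, 100, 120, 140), n_obs, replace = TRUE),  # tube potential
  dri      = sample(12:24, n_obs, replace = TRUE),                 # DoseRight Index
  pitch    = runif(n_obs, 0.6, 0.925),                             # helical pitch
  rot_time = sample(c(0.33, 0.5, 0.75), n_obs, replace = TRUE),    # rotation time (s)
  coll_mm  = sample(c(16, 32, 64) * 0.625, n_obs, replace = TRUE), # collimation n x T (mm)
  diam_cm  = runif(n_obs, 16, 40)                                  # attenuating diameter
)

# Synthetic tube-current response that loosely mimics the qualitative trends
# described in the abstract, with multiplicative noise
scans$ma <- exp(4 + 0.04 * scans$diam_cm + 0.05 * scans$dri - 0.015 * scans$kvp
                + 0.8 * log(scans$pitch) - 1.0 * log(scans$rot_time)
                + 0.3 * log(scans$coll_mm) + rnorm(n_obs, sd = 0.1))

# Log-transforming the response and selected predictors is one plausible way
# to linearise multiplicative dose relationships before ordinary least squares
fit <- lm(log(ma) ~ diam_cm + dri + kvp + log(pitch) + log(rot_time) + log(coll_mm),
          data = scans)

summary(fit)$adj.r.squared  # adjusted R-squared of the fitted sketch model
```

An analogous fit with image noise as the response, using transformations chosen per predictor, would yield the second model type the abstract describes.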