Abstract

Background: The effect of region of interest (ROI) size on the accuracy of measured noise levels has not yet been studied.

Objective: This study aimed to evaluate the influence of ROI size on the accuracy of noise measurement in computed tomography (CT), using images of a computational phantom and an American College of Radiology (ACR) phantom.

Material and Methods: In this experimental study, two phantoms were used: a computational phantom and the ACR phantom. The computational phantom was generated with MATLAB R2015a software (MathWorks Inc., Natick, MA) as a homogeneous image of +100 Hounsfield units (HU) with added Gaussian noise at levels of 5, 10, 25, 50, 75, and 100 HU. The ACR phantom was scanned with a Philips MX 16-slice CT scanner at slice thicknesses of 1.5, 3, 5, and 7 mm to obtain noise variation. Noise was measured at the center of the phantom images and at four locations close to the edge of the phantom images, using ROI sizes from 3 × 3 to 41 × 41 pixels in steps of 2 pixels per side.

Results: An ROI of at least 21 × 21 pixels yields measured noise within ±5% of the ground-truth noise. The measurement error exceeds the ±5% range when the ROI used is smaller than 21 × 21 pixels.

Conclusion: A minimum ROI size of 21 × 21 pixels is required to maintain the accuracy of noise measurement.
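For illustration, a minimal MATLAB sketch of the noise-measurement procedure on the computational phantom is given below, for a single noise level and a single centered ROI. The 256 × 256 matrix size, the fixed random seed, and the use of one noise realization are assumptions made here to keep the example self-contained; the abstract does not specify these details.

    % Minimal sketch of ROI-based noise measurement on a computational phantom.
    % Assumed: 256 x 256 matrix, one centered ROI, one noise realization.
    rng(1);                                      % reproducible noise realization
    trueNoise = 25;                              % ground-truth noise level (HU)
    img = 100*ones(256) + trueNoise*randn(256);  % +100 HU phantom + Gaussian noise

    cx = 128; cy = 128;                          % ROI center at the image center
    for n = 3:2:41                               % ROI sizes 3x3 up to 41x41 pixels
        h = (n - 1)/2;                           % half-width of the ROI
        roi = img(cy-h:cy+h, cx-h:cx+h);
        measured = std(roi(:));                  % noise = standard deviation in the ROI
        err = 100*(measured - trueNoise)/trueNoise;
        fprintf('ROI %2dx%-2d: noise = %5.2f HU (%+5.1f%% error)\n', n, n, measured, err);
    end

For uncorrelated Gaussian noise, as in the computational phantom, this behavior matches sampling statistics: the relative uncertainty of a sample standard deviation from N pixels is roughly 1/sqrt(2N), which for a 21 × 21 ROI (N = 441) is about 3.4%, within the ±5% band reported in the Results.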
