Abstract
We investigate the accuracy of direct attenuation correction (AC) in the image domain for myocardial perfusion single-photon emission computed tomography (MPI-SPECT) imaging using residual (ResNet) and UNet deep convolutional neural networks. 99mTc-sestamibi MPI-SPECT images of 99 patients were retrospectively included. The UNet and ResNet networks were trained with non-attenuation-corrected SPECT images as input and CT-based attenuation-corrected (CT-AC) SPECT images as reference. Chang's calculated AC method, which assumes a uniform attenuation coefficient within the body contour, was also implemented for comparison. Clinical and quantitative evaluations of the proposed methods were performed with the SPECT CT-AC images of 19 subjects (external validation set) as reference. Image-derived metrics, including the voxel-wise mean error (ME), mean absolute error, relative error, structural similarity index (SSI), and peak signal-to-noise ratio, as well as clinically relevant indices such as the total perfusion deficit (TPD), were used. Overall, the AC SPECT images generated by the deep learning networks showed good agreement with the SPECT CT-AC images and substantially outperformed Chang's method. The ResNet and UNet models yielded MEs of −6.99 ± 16.72 and −4.41 ± 11.8 and SSIs of 0.99 ± 0.04 and 0.98 ± 0.05, respectively, whereas Chang's approach led to an ME of 25.52 ± 33.98 and an SSI of 0.93 ± 0.09. Similarly, the clinical evaluation revealed mean TPDs of 12.78 ± 9.22% (ResNet) and 12.57 ± 8.93% (UNet), compared with 12.84 ± 8.63% obtained from the SPECT CT-AC images; Chang's approach gave a mean TPD of 16.68 ± 11.24%. The deep learning AC methods have the potential to achieve reliable AC in MPI-SPECT imaging.
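The voxel-wise agreement metrics named above can be sketched as follows. This is a minimal illustration with assumed definitions (the paper's exact normalisation, masking, and units are not specified in the abstract); `evaluation_metrics` is a hypothetical helper, not code from the study.

```python
import numpy as np

def evaluation_metrics(pred, ref, eps=1e-8):
    """Voxel-wise agreement between a predicted AC image and the CT-AC
    reference. Definitions are assumptions, not the paper's exact ones."""
    pred = np.asarray(pred, dtype=np.float64)
    ref = np.asarray(ref, dtype=np.float64)
    diff = pred - ref
    me = diff.mean()                          # mean error (signed bias)
    mae = np.abs(diff).mean()                 # mean absolute error
    re = 100.0 * (diff / (ref + eps)).mean()  # mean relative error (%)
    mse = (diff ** 2).mean()
    peak = ref.max()                          # peak signal for PSNR
    psnr = 10.0 * np.log10(peak ** 2 / mse) if mse > 0 else np.inf
    return {"ME": me, "MAE": mae, "RE%": re, "PSNR": psnr}
```

The structural similarity index (SSI) is typically computed with a windowed implementation such as `skimage.metrics.structural_similarity` rather than re-derived by hand.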