Abstract

Clinical PET/CT examinations rely on the CT component for anatomical localization and attenuation correction of the PET data; however, the CT scan adds ionizing radiation dose for the patient. We propose a deep learning framework that learns the mapping between attenuation-corrected (AC) and non-attenuation-corrected (NAC) PET images to estimate PET attenuation maps and generate pseudo-CT images for medical observation. In this study, 5760, 1608, and 1351 pairs of transverse PET-CT slices were used as the training, validation, and testing sets, respectively. A pix2pix model was adopted to predict AC PET images from NAC PET images, from which PET attenuation maps (µ-maps) were calculated. The same model architecture was then applied to generate realistic CT images from the calculated µ-maps. The quality of the predicted AC PET and CT images was assessed using the normalized root mean square error (NRMSE), peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), and Pearson correlation coefficient (PCC). Relative to true AC PET, the synthetic AC PET achieved strong quantitative agreement, with 2.20 ± 1.17% NRMSE, 34.03 ± 4.73 dB PSNR, 97.90 ± 1.22% SSIM, and 98.45 ± 1.31% PCC. Radiologists who rated the images deemed the synthetic CT and synthetic AC PET acceptable, as they provided sufficient anatomical and functional information, respectively. This work demonstrates that the proposed deep learning framework is a promising method for clinical applications such as radiotherapy and low-dose imaging.
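For reference, the four reported image-quality metrics can be sketched as below. This is a minimal NumPy sketch, not the authors' evaluation code: the function names are illustrative, NRMSE is assumed to be normalized by the reference dynamic range, and the SSIM shown is a simplified single-window variant rather than the standard locally windowed implementation the paper presumably used.

```python
import numpy as np

def nrmse(ref, pred):
    # Root mean square error normalized by the reference dynamic range
    # (one common convention; the paper does not state which it used).
    rmse = np.sqrt(np.mean((ref - pred) ** 2))
    return rmse / (ref.max() - ref.min())

def psnr(ref, pred):
    # Peak signal-to-noise ratio in dB, taking the reference maximum as peak.
    mse = np.mean((ref - pred) ** 2)
    return 10.0 * np.log10(ref.max() ** 2 / mse)

def global_ssim(ref, pred):
    # Simplified SSIM computed over the whole image as a single window;
    # standard SSIM averages this quantity over local sliding windows.
    L = ref.max() - ref.min()                  # dynamic range
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2  # standard stabilizing constants
    mu_x, mu_y = ref.mean(), pred.mean()
    var_x, var_y = ref.var(), pred.var()
    cov = np.mean((ref - mu_x) * (pred - mu_y))
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))

def pcc(ref, pred):
    # Pearson correlation coefficient between the flattened images.
    return np.corrcoef(ref.ravel(), pred.ravel())[0, 1]
```

In practice these would be computed slice by slice between the true AC PET (or CT) and its synthetic counterpart, then reported as mean ± standard deviation over the test set, which matches the form of the figures quoted above.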

