Abstract

Positron emission tomography-computed tomography (PET/CT) is an essential imaging instrument for lymphoma diagnosis and prognosis. PET/CT-based automatic lymphoma segmentation is increasingly used in the clinical community. U-Net-like deep learning methods have been widely applied to this task, but their performance is limited by the lack of sufficient annotated data, owing to tumor heterogeneity. To address this issue, we propose an unsupervised image generation scheme that improves the performance of an independent supervised U-Net for lymphoma segmentation by capturing metabolic anomaly appearance (MAA). First, we propose an anatomical-metabolic consistency generative adversarial network (AMC-GAN) as an auxiliary branch of the U-Net. Specifically, AMC-GAN learns representations of normal anatomical and metabolic information from co-aligned whole-body PET/CT scans. In its generator, we introduce a complementary attention block to enhance the feature representation of low-intensity regions. The trained AMC-GAN then reconstructs corresponding pseudo-normal PET scans, from which MAAs are captured. Finally, combined with the original PET/CT images, the MAAs serve as prior information to improve lymphoma segmentation. Experiments are conducted on a clinical dataset of 191 normal subjects and 53 patients with lymphomas. The results demonstrate that anatomical-metabolic consistency representations learned from unlabeled paired PET/CT scans support more accurate lymphoma segmentation, suggesting the potential of our approach to aid physician diagnosis in practical clinical applications.
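The abstract does not give the exact formula by which MAAs are derived from the pseudo-normal reconstruction; a minimal sketch, assuming the MAA is the positive residual between the observed PET and the AMC-GAN's pseudo-normal PET and that PET, CT, and MAA are stacked as input channels for the downstream U-Net (both are assumptions, as are all function names below), might look like:

```python
import numpy as np

def metabolic_anomaly_appearance(pet, pseudo_normal_pet):
    """Hypothetical MAA: the positive residual between the observed PET
    and the pseudo-normal PET reconstructed by the trained AMC-GAN."""
    return np.clip(pet - pseudo_normal_pet, 0.0, None)

def build_segmentation_input(pet, ct, pseudo_normal_pet):
    """Stack PET, CT, and the MAA prior as channels for the U-Net."""
    maa = metabolic_anomaly_appearance(pet, pseudo_normal_pet)
    return np.stack([pet, ct, maa], axis=0)

# Toy 2-D slices stand in for whole-body volumes.
pet = np.array([[1.0, 5.0], [2.0, 1.0]])      # observed uptake; (0, 1) is anomalous
ct = np.zeros_like(pet)                        # co-aligned anatomical channel
pseudo = np.array([[1.0, 1.5], [2.0, 1.0]])    # GAN output with the anomaly "normalized"
x = build_segmentation_input(pet, ct, pseudo)
# x[2] (the MAA channel) is nonzero only at the high-uptake anomaly (0, 1)
```

The residual form reflects the intuition stated in the abstract: regions where observed metabolism exceeds the learned normal appearance are exactly the candidate lesion areas the segmentation network should attend to.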
