Abstract
Background
Segmentation of neuroendocrine neoplasms (NENs) in [64Cu]Cu-DOTATATE positron emission tomography (PET) makes it possible to extract quantitative measures usable for prognostication of patients. However, manual tumor segmentation is cumbersome and time-consuming. We therefore aimed to implement and test an artificial intelligence (AI) network for tumor segmentation. Patients with gastroenteropancreatic or lung NEN who had undergone [64Cu]Cu-DOTATATE PET/CT were included in our training (n = 117) and test (n = 41) cohorts. In addition, 10 patients with no signs of NEN were included as negative controls. Ground-truth segmentations were obtained by a physician using a standardized semiautomatic method for tumor segmentation. The nnU-Net framework was used to set up a deep learning U-Net architecture. Dice score, sensitivity, and precision were used for selection of the final model. AI segmentations were implemented in a clinical imaging viewer, where a physician evaluated performance and performed manual adjustments.

Results
Cross-validation training was used to generate models and an ensemble model. The ensemble model performed best overall, with a lesion-wise Dice of 0.850 and pixel-wise Dice, precision, and sensitivity of 0.801, 0.786, and 0.872, respectively. Performance of the ensemble model was acceptable with some degree of manual adjustment in 35/41 (85%) patients. Final tumor segmentation could be obtained from the AI model with manual adjustments in 5 min, versus 17 min for the ground-truth method (p < 0.01).

Conclusion
We implemented and validated an AI model that achieved high similarity with ground-truth segmentation and resulted in faster tumor segmentation. With AI, total tumor segmentation may become feasible in clinical routine.
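For readers unfamiliar with the overlap metrics reported above, the sketch below shows how pixel-wise Dice, precision, and sensitivity can be computed for a pair of binary masks (AI output versus ground-truth segmentation). This is a minimal illustration using NumPy; the function name and implementation are ours and are not taken from the paper, which does not specify its evaluation code.

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, truth: np.ndarray):
    """Pixel-wise Dice, precision and sensitivity for two binary masks.

    `pred` and `truth` are arrays of identical shape, e.g. the AI
    segmentation and the physician's ground-truth mask of one PET volume.
    (Illustrative helper; not from the original study.)
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)

    tp = np.logical_and(pred, truth).sum()   # voxels labelled tumor by both
    fp = np.logical_and(pred, ~truth).sum()  # voxels labelled tumor only by the AI
    fn = np.logical_and(~pred, truth).sum()  # tumor voxels missed by the AI

    dice = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0
    precision = tp / (tp + fp) if (tp + fp) else 1.0
    sensitivity = tp / (tp + fn) if (tp + fn) else 1.0
    return dice, precision, sensitivity
```

Lesion-wise Dice is computed analogously but per connected lesion rather than over all voxels, so it weights small and large lesions more evenly.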