Abstract

The accurate segmentation of pancreatic subregions (head, body, and tail) in CT images provides an opportunity to examine local morphological and textural changes in the pancreas. Quantifying such changes aids in understanding the spatial heterogeneity of the pancreas and assists in the diagnosis and treatment planning of pancreatic cancer. Manual outlining of pancreatic subregions is tedious, time-consuming, and prone to subjective inconsistency. This paper presents a multistage anatomy-guided framework for accurate and automatic 3D segmentation of pancreatic subregions in CT images. Using the delineated pancreas, two soft-label maps were estimated for subregional segmentation: one by training a fully supervised naïve Bayes model that considers the length and volumetric proportions of each subregional structure based on their anatomical arrangement, and the other by using the conventional deep learning U-Net architecture for 3D segmentation. The U-Net model then estimates the joint probability of the two maps and performs optimal segmentation of the subregions. Model performance was assessed using three datasets of contrast-enhanced abdominal CT scans: one public NIH dataset of the healthy pancreas, and two datasets, D1 and D2 (one of the pre-cancerous and one of the cancerous pancreas, respectively). The model demonstrated excellent performance during multifold cross-validation on the NIH dataset and external validation on D1 and D2. To the best of our knowledge, this is the first automated model for the segmentation of pancreatic subregions in CT images. A dataset consisting of reference anatomical labels for subregions in all images of the NIH dataset is also established.
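The fusion step described above (combining the anatomy-derived and U-Net soft-label maps via their joint probability) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the joint probability is formed by an elementwise product of the two per-voxel class-probability maps (i.e., treating them as conditionally independent), followed by renormalization and a per-voxel argmax. The function name and array shapes are illustrative.

```python
import numpy as np

def fuse_soft_labels(p_anatomy, p_unet, eps=1e-8):
    """Fuse two per-voxel soft-label maps of shape (..., n_classes).

    Assumes a simple joint-probability model: elementwise product of
    the two class-probability maps, renormalized per voxel, followed
    by an argmax to obtain hard subregion labels.
    """
    joint = p_anatomy * p_unet
    joint = joint / (joint.sum(axis=-1, keepdims=True) + eps)
    labels = joint.argmax(axis=-1)
    return joint, labels

# Toy example: 2 voxels, 3 subregion classes (head, body, tail).
p_a = np.array([[0.7, 0.2, 0.1],
                [0.1, 0.5, 0.4]])
p_u = np.array([[0.6, 0.3, 0.1],
                [0.2, 0.2, 0.6]])
joint, labels = fuse_soft_labels(p_a, p_u)
# labels: voxel 0 -> class 0 (head), voxel 1 -> class 2 (tail)
```

In practice the arrays would be full 3D volumes of shape (D, H, W, n_classes); the broadcasting logic is identical.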
