Abstract
Our main objective is to develop a novel deep learning-based algorithm for automatic segmentation of the prostate zones and to evaluate it on an additional independent testing dataset, in comparison with the inter-reader consistency between two experts. With IRB approval and HIPAA compliance, we designed a novel convolutional neural network (CNN) for automatic segmentation of the prostatic transition zone (TZ) and peripheral zone (PZ) on T2-weighted (T2w) MRI. The total study cohort included 359 patients from two sources: 313 from a deidentified publicly available dataset (SPIE-AAPM-NCI PROSTATEX challenge) and 46 from a large U.S. tertiary referral center with 3T MRI (external testing dataset, ETD). The TZ and PZ contours were manually annotated by research fellows, supervised by genitourinary (GU) radiologists. The model was developed using 250 patients, tested internally using the remaining 63 patients from PROSTATEX (internal testing dataset, ITD), and tested again externally using the ETD (n=46). The Dice Similarity Coefficient (DSC) was used to evaluate segmentation performance. In the ITD, DSCs for PZ and TZ were 0.74 and 0.86, respectively. In the ETD, DSCs for PZ and TZ were 0.74 and 0.792, respectively. The inter-reader consistency (Expert 2 vs. Expert 1) was 0.71 (PZ) and 0.75 (TZ). This novel DL algorithm enabled automatic segmentation of the PZ and TZ with high accuracy on both the ITD and ETD, with no performance difference for the PZ and less than a 10% difference for the TZ. In the ETD, the proposed method was comparable to experts in the segmentation of prostate zones.
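The abstract reports all results as Dice Similarity Coefficients between the predicted and reference zone masks. As a point of reference, the standard DSC for binary masks can be sketched as follows; this is a minimal NumPy illustration of the metric, not the authors' evaluation code, and the toy masks are invented for demonstration.

```python
import numpy as np

def dice_coefficient(pred, gt, eps=1e-8):
    """Dice Similarity Coefficient between two binary masks.

    DSC = 2|A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1
    (perfect agreement). `eps` guards against empty masks.
    """
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return 2.0 * intersection / (pred.sum() + gt.sum() + eps)

# Toy example: two 4x4 masks, each with 4 foreground pixels, 2 shared
a = np.zeros((4, 4)); a[1:3, 1:3] = 1
b = np.zeros((4, 4)); b[1:3, 2:4] = 1
print(round(dice_coefficient(a, b), 2))  # → 0.5
```

The same formula applies per zone (PZ or TZ) and is typically averaged over patients, which is how values such as 0.74 and 0.86 above are obtained.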
Highlights
Prostate cancer (PCa) is the most common solid noncutaneous cancer in American men [1]
Under the Prostate Imaging Reporting and Data System version 2.1 (PI-RADSv2.1), an expert guideline for the performance and interpretation of multiparametric MRI (mpMRI) for PCa detection [4], [5], T2-weighted and diffusion-weighted imaging (DWI) images are used for the primary interpretation of lesions in the peripheral zone (PZ) and transition zone (TZ), respectively, when assigning a PI-RADS score to lesions detected on mpMRI [6]. A robust method for reproducible, automatic segmentation of prostate zones (ASPZ) may enable the consistent assignment of PI-RADS scores.
For the PROSTATEX data, both the TZ and PZ were segmented in OsiriX (Pixmeo SARL, Bernex, Switzerland) by two MRI research fellows; the contours were later cross-checked by both genitourinary (GU) radiologists (each with 10-15 years of post-fellowship experience interpreting over 1,000 prostate mpMRI examinations) and clinical research fellows.
Summary
Prostate cancer (PCa) is the most common solid noncutaneous cancer in American men [1]. A robust method for reproducible, automatic segmentation of prostate zones (ASPZ) may enable the consistent assignment of PI-RADS scores [6]. Atlas-based methods were previously proposed to segment the prostate zones [8]. However, the semantic information captured by U-Net may not be sufficient to describe the heterogeneous anatomic structures of the prostate and the indiscernible borders between the TZ and PZ, resulting in inconsistent and sub-optimal ASPZ performance. We propose a new DL-based method for automatic segmentation of the prostate zones: a fully convolutional network with a novel feature pyramid attention mechanism. The proposed CNN consists of three sub-networks: an improved deep residual network (based on ResNet50) [14], a pyramid feature network with attention [15], and a decoder. We compared the proposed method with the inter-reader consistency of two independent expert manual segmentations.
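The core idea of a feature pyramid with attention is to fuse encoder features from several resolutions, letting learned weights decide how much each scale contributes at the finest resolution. The following is a minimal NumPy sketch of that fusion idea only; it is not the authors' network, and the global-average attention scoring, feature shapes, and nearest-neighbour upsampling are simplifying assumptions made for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def upsample(feat, size):
    """Nearest-neighbour upsampling of a (C, H, W) feature map."""
    _, h, w = feat.shape
    return feat.repeat(size[0] // h, axis=1).repeat(size[1] // w, axis=2)

def pyramid_attention_fuse(features):
    """Fuse multi-scale features with scale-level attention weights.

    `features`: list of (C, H_i, W_i) maps from coarse to fine. Each map
    is upsampled to the finest resolution, scored by its global average
    activation (a stand-in for a learned attention branch), and combined
    as a softmax-weighted sum.
    """
    target = features[-1].shape[1:]
    ups = [upsample(f, target) for f in features]
    weights = softmax(np.array([u.mean() for u in ups]))
    return sum(w * u for w, u in zip(weights, ups))

# Three pyramid levels with 8 channels each, coarse (16x16) to fine (64x64)
feats = [np.random.rand(8, 16, 16),
         np.random.rand(8, 32, 32),
         np.random.rand(8, 64, 64)]
fused = pyramid_attention_fuse(feats)
print(fused.shape)  # (8, 64, 64)
```

In the actual method, the attention weights come from trained convolutional branches rather than a fixed pooling statistic, and the fused features feed the decoder sub-network that produces the TZ/PZ masks.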