Abstract

Multiparametric magnetic resonance imaging (mp-MRI) has shown excellent results in the detection of prostate cancer (PCa). However, characterizing the aggressiveness of prostate lesions from mp-MRI sequences is not possible in clinical practice, and biopsy remains the reference standard to determine the Gleason score (GS). In this work, we propose a novel end-to-end multi-class network that jointly segments the prostate gland and cancer lesions with GS group grading. After encoding the information into a latent space, the network splits into two branches: 1) the first branch performs prostate segmentation; 2) the second branch uses this zonal prior as an attention gate for the detection and grading of prostate lesions. The model was trained and validated with 5-fold cross-validation on a heterogeneous series of 219 MRI exams acquired on three different scanners prior to prostatectomy. In the free-response receiver operating characteristic (FROC) analysis for the detection of clinically significant lesions (defined as GS > 6), our model achieves 69.0% ± 14.5% sensitivity at 2.9 false positives per patient on the whole prostate and 70.8% ± 14.4% sensitivity at 1.5 false positives per patient when considering the peripheral zone (PZ) only. For automatic GS group grading, Cohen's quadratic weighted kappa coefficient (κ) is 0.418 ± 0.138, which is, to our knowledge, the best reported lesion-wise kappa for GS segmentation. The model shows encouraging generalization, with κ = 0.120 ± 0.092 on the public PROSTATEx-2 dataset, and achieves state-of-the-art performance for segmentation of the whole prostate gland with a Dice of 0.875 ± 0.013. Finally, we show that ProstAttention-Net outperforms reference segmentation models, including U-Net, DeepLabv3+ and E-Net. The proposed attention mechanism is also shown to outperform Attention U-Net.
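To illustrate the two-branch design described above, the following is a minimal, hypothetical PyTorch sketch of a shared encoder whose lesion branch is gated by the prostate branch's soft prediction. All module names, sizes and the simplified encoder/decoder structure are illustrative assumptions, not the authors' implementation of ProstAttention-Net.

```python
# Illustrative sketch only: a shared encoder, a prostate-segmentation branch,
# and a lesion-grading branch gated by the predicted prostate probability map.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class TwoBranchAttentionNet(nn.Module):
    """Shared encoder; branch 1 segments the prostate, branch 2 predicts
    lesion classes (GS groups) from features gated by the prostate mask."""

    def __init__(self, in_ch=3, n_lesion_classes=6, feat=32):
        super().__init__()
        self.encoder = nn.Sequential(ConvBlock(in_ch, feat), ConvBlock(feat, feat))
        # Branch 1: background / prostate segmentation
        self.prostate_head = nn.Sequential(ConvBlock(feat, feat), nn.Conv2d(feat, 2, 1))
        # Branch 2: lesion detection and GS group grading
        self.lesion_head = nn.Sequential(ConvBlock(feat, feat), nn.Conv2d(feat, n_lesion_classes, 1))

    def forward(self, x):
        z = self.encoder(x)
        prostate_logits = self.prostate_head(z)
        # Soft prostate probability acts as an attention gate on the shared features
        prostate_prob = torch.softmax(prostate_logits, dim=1)[:, 1:2]
        lesion_logits = self.lesion_head(z * prostate_prob)
        return prostate_logits, lesion_logits


# Usage (illustrative):
# net = TwoBranchAttentionNet()
# prostate_out, lesion_out = net(torch.randn(1, 3, 128, 128))
```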
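The grading metric reported in the abstract, Cohen's quadratic weighted kappa, can be computed with scikit-learn as sketched below. The lesion labels here are made-up placeholders, and the paper's exact lesion-matching and evaluation protocol may differ.

```python
# Minimal sketch: quadratic weighted kappa between predicted and reference GS groups.
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-lesion GS group labels (1..5); values are illustrative only.
reference_groups = [1, 2, 3, 3, 4, 2, 4]
predicted_groups = [1, 2, 2, 3, 5, 1, 4]

kappa = cohen_kappa_score(reference_groups, predicted_groups, weights="quadratic")
print(f"Quadratic weighted kappa: {kappa:.3f}")
```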
