Abstract

The landscape of prostate cancer (PCa) segmentation within multiparametric magnetic resonance imaging (MP-MRI) is fragmented, with a noticeable lack of consensus on how to incorporate background information, resulting in inconsistent segmentation outputs. Given the complex and heterogeneous nature of PCa, conventional imaging segmentation algorithms frequently fall short, prompting the need for specialized research and refinement. This study sought to dissect and compare various segmentation methods, emphasizing the role of background information and gland masks in achieving superior PCa segmentation. The goal was to systematically refine segmentation networks to ascertain the most efficacious approach. A cohort of 232 patients (ages 61-73 years; prostate-specific antigen: 3.4-45.6 ng/mL), who had undergone MP-MRI followed by prostate biopsies, was analyzed. An advanced segmentation model, Attention-Unet, which combines U-Net with attention gates, was employed for training and validation. The model was further enhanced with a multiscale module and a composite loss function, culminating in the development of Matt-Unet. Performance metrics included the Dice Similarity Coefficient (DSC) and accuracy (ACC). The Matt-Unet model, which integrated background information and gland masks, outperformed the baseline U-Net model trained on raw images, yielding significant gains (DSC: 0.7215 vs. 0.6592; ACC: 0.8899 vs. 0.8601; p < 0.001). In summary, a targeted and practical PCa segmentation method was designed that significantly improves PCa segmentation on MP-MRI by combining background information and gland masks. The Matt-Unet model showcased promising capabilities for effectively delineating PCa, enhancing the precision of MP-MRI analysis.
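
For context, the two reported evaluation metrics can be computed as in the minimal sketch below. This is a generic reference implementation in Python/NumPy, not the authors' evaluation code; the smoothing constant `eps` is an added assumption to handle empty masks gracefully.

```python
import numpy as np

def dice_similarity_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice Similarity Coefficient (DSC) between two binary segmentation masks.

    `eps` is a small smoothing term (an assumption, not from the paper) so that
    two empty masks score 1.0 instead of dividing by zero.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))

def pixel_accuracy(pred: np.ndarray, target: np.ndarray) -> float:
    """Accuracy (ACC): fraction of pixels where prediction and reference agree."""
    return float((pred.astype(bool) == target.astype(bool)).mean())
```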
