Abstract
Segmentation of the prostate bed, the residual tissue left after removal of the prostate gland, is an essential prerequisite for post-prostatectomy radiotherapy, but it is also a challenging task because the prostate bed has no contrast-defined boundaries and its shape varies strongly with the neighboring organs. In this work, we propose a novel deep-learning method to automatically segment this "invisible target". The main idea of our design is to draw reference from the surrounding normal structures (the bladder and rectum) and exploit this information to facilitate prostate bed segmentation. To this end, we first use a U-Net as the backbone network to segment the bladder and rectum; this serves as a low-level task that provides references for the high-level task of prostate bed segmentation. On top of the backbone, we build a novel attention network with a series of cascaded attention modules that extract discriminative features for the high-level prostate bed segmentation task. Since the attention network depends on the backbone network only one-sidedly, mirroring the clinical workflow in which normal structures guide the delineation of the radiotherapy target, we name the resulting model the asymmetrical multi-task attention U-Net. Extensive experiments on a clinical dataset of 186 CT images demonstrate the effectiveness of this design and show that the model outperforms conventional atlas-based methods for prostate bed segmentation. The source code is publicly available at https://github.com/superxuang/amta-net.
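To make the asymmetrical multi-task idea concrete, the sketch below shows one way the described architecture could be assembled in PyTorch: a small U-Net backbone segments the bladder and rectum, and a separate attention branch reuses the backbone's decoder features through cascaded attention modules to predict the prostate bed. This is a minimal illustration under our own assumptions, not the authors' implementation (the official code is at the repository linked above); all module and variable names here (AttentionModule, AMTAUNetSketch, etc.) are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions, each followed by batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class AttentionModule(nn.Module):
    """Gates the attention-branch features with a map derived from the
    backbone decoder features, so the prostate-bed branch can attend to
    the bladder/rectum representation (a hypothetical formulation)."""

    def __init__(self, ch):
        super().__init__()
        self.gate = nn.Sequential(nn.Conv2d(ch, ch, 1), nn.Sigmoid())
        self.fuse = conv_block(2 * ch, ch)

    def forward(self, branch_feat, backbone_feat):
        attn = self.gate(backbone_feat)  # attention map from the low-level task
        return self.fuse(torch.cat([branch_feat * attn, backbone_feat], dim=1))


class AMTAUNetSketch(nn.Module):
    def __init__(self, in_ch=1, n_organs=3, chs=(32, 64, 128)):
        super().__init__()
        # --- U-Net backbone (low-level task: bladder & rectum) ---
        self.enc1 = conv_block(in_ch, chs[0])
        self.enc2 = conv_block(chs[0], chs[1])
        self.bottleneck = conv_block(chs[1], chs[2])
        self.up2 = nn.ConvTranspose2d(chs[2], chs[1], 2, stride=2)
        self.dec2 = conv_block(2 * chs[1], chs[1])
        self.up1 = nn.ConvTranspose2d(chs[1], chs[0], 2, stride=2)
        self.dec1 = conv_block(2 * chs[0], chs[0])
        self.organ_head = nn.Conv2d(chs[0], n_organs, 1)  # bladder/rectum/background
        # --- Cascaded attention branch (high-level task: prostate bed) ---
        self.branch_up2 = nn.ConvTranspose2d(chs[2], chs[1], 2, stride=2)
        self.attn2 = AttentionModule(chs[1])
        self.branch_up1 = nn.ConvTranspose2d(chs[1], chs[0], 2, stride=2)
        self.attn1 = AttentionModule(chs[0])
        self.pb_head = nn.Conv2d(chs[0], 1, 1)  # prostate-bed logits

    def forward(self, x):
        # Backbone encoder-decoder
        e1 = self.enc1(x)
        e2 = self.enc2(F.max_pool2d(e1, 2))
        b = self.bottleneck(F.max_pool2d(e2, 2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        organs = self.organ_head(d1)
        # One-sided dependency: the attention branch reads backbone features,
        # but the backbone never sees the branch (hence "asymmetrical").
        f2 = self.attn2(self.branch_up2(b), d2)
        f1 = self.attn1(self.branch_up1(f2), d1)
        prostate_bed = self.pb_head(f1)
        return organs, prostate_bed


# Usage: both tasks are predicted in one forward pass on a 2D slice.
model = AMTAUNetSketch()
organs, prostate_bed = model(torch.randn(1, 1, 64, 64))
```

In this sketch the branch-to-backbone connections run forward only, which reflects the one-sided dependency described in the abstract; how the two task losses are weighted during training is left open here.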