Abstract

We propose a hybrid deep learning-based method, which combines a cycle-consistent generative adversarial network (CycleGAN) and a deep attention fully convolutional network implemented as a U-Net (DAUnet), to perform volumetric multi-organ segmentation for pelvic computed tomography (CT). The proposed method first utilized CycleGAN to generate synthetic MRI (sMRI), which provides superior soft-tissue contrast. The sMRI was then fed into the DAUnet to obtain volumetric segmentations of the bladder, prostate, and rectum simultaneously via a multi-channel output. The deep attention strategy was introduced to retrieve the features most relevant to identifying organ boundaries. Deep supervision was incorporated into the DAUnet to enhance the features' discriminative ability. Segmented contours for a patient were obtained by feeding the CT image into the trained CycleGAN to generate sMRI, which was then fed to the trained DAUnet to generate the organ contours. A retrospective study was performed with data sets from 45 patients with prostate cancer. The Dice similarity coefficient and mean surface distance indices for bladder, prostate, and rectum contours were 0.94, 0.47 mm; 0.86, 0.78 mm; and 0.89, 0.85 mm, respectively. The proposed network provides accurate and consistent bladder, prostate, and rectum segmentation without the need for additional MRI scans. With further evaluation and clinical implementation, this method has the potential to facilitate routine prostate-cancer radiotherapy treatment planning.
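
The following is a minimal PyTorch sketch of the two-stage inference pipeline described above. The names (AttentionGate, segment_ct), the 3D attention-gate design, and all layer sizes are illustrative assumptions, not the authors' exact architecture.

    # Sketch only: assumes PyTorch; architecture details are illustrative.
    import torch
    import torch.nn as nn

    class AttentionGate(nn.Module):
        # Weights skip-connection features by their relevance to the decoder
        # signal -- one common realization of a "deep attention" strategy
        # for emphasizing organ-boundary features.
        def __init__(self, channels: int):
            super().__init__()
            self.score = nn.Sequential(
                nn.Conv3d(2 * channels, channels, kernel_size=1),
                nn.ReLU(inplace=True),
                nn.Conv3d(channels, 1, kernel_size=1),
                nn.Sigmoid(),
            )

        def forward(self, skip: torch.Tensor, gating: torch.Tensor) -> torch.Tensor:
            # skip and gating are assumed to share spatial dimensions.
            alpha = self.score(torch.cat([skip, gating], dim=1))  # voxel-wise weights in [0, 1]
            return skip * alpha  # suppress irrelevant features, keep boundary cues

    def segment_ct(ct: torch.Tensor, cyclegan_g: nn.Module, daunet: nn.Module) -> torch.Tensor:
        # Stage 1: translate CT to synthetic MRI for better soft-tissue contrast.
        # Stage 2: predict a 4-channel map (background, bladder, prostate, rectum).
        with torch.no_grad():
            smri = cyclegan_g(ct)
            logits = daunet(smri)    # shape: (batch, 4, depth, height, width)
        return logits.argmax(dim=1)  # per-voxel organ label

Note that deep supervision, i.e., auxiliary losses on intermediate decoder outputs, applies only at training time and is therefore omitted from this inference-only sketch.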
