Abstract

The presence of the applicator makes organ segmentation very difficult on brachytherapy CT images. To address this issue, we developed a novel deep learning model, the super perception convolutional neural network (SPNet), to accurately segment organs at risk (OARs) from cervical cancer brachytherapy CT images. SPNet is built on a classical pyramidal structure, with the double convolution block replaced by a super perception (SP) block that combines dilated convolution and inception convolution. With the same number of parameters, SPNet has over four times the receptive field of a general pyramidal network such as the UNet. We collected CT images of 90 cervical cancer patients treated with brachytherapy at our institution to train and evaluate the proposed method. Contours of three OARs (bladder, rectum, and sigmoid) were manually delineated, checked for quality assurance by a radiation oncologist, and used as the ground truth. The proposed SPNet was trained on 58 randomly selected patients, internally validated on 10 patients, and tested on the remaining 22. Segmentation accuracy was quantitatively evaluated by the Dice similarity coefficient and compared with results from a UNet implementation. Our proposed method achieved Dice values of 91.4 (±2.0)%, 82.4 (±6.0)%, and 75.4 (±8.9)% for the bladder, rectum, and sigmoid, respectively; for comparison, the UNet achieved 75.1 (±16.1)%, 62.6 (±16.6)%, and 56.3 (±17.8)%. These quantitative results demonstrate that the proposed method segments OARs accurately for cervical cancer brachytherapy and that our model outperforms a general pelvic OAR segmentation model.
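The claimed "over four times" gain in receptive field is plausible in two dimensions: dilated convolutions widen the linear receptive field, and the 2-D receptive area grows with its square. The sketch below illustrates this arithmetic for stacked stride-1 convolutions; the specific dilation rates (1 and 4) are illustrative assumptions, as the abstract does not state the SP block's exact configuration.

```python
def receptive_field(layers):
    """Linear receptive field of stacked stride-1 convolutions.

    layers: list of (kernel_size, dilation) pairs.
    Formula: rf = 1 + sum((k - 1) * d) for stride-1 stacks.
    """
    rf = 1
    for kernel_size, dilation in layers:
        rf += (kernel_size - 1) * dilation
    return rf


# Standard UNet double-conv block: two 3x3 convs, dilation 1.
unet_rf = receptive_field([(3, 1), (3, 1)])   # 5  -> 5x5 area

# Hypothetical SP-style block: two 3x3 convs, dilations 1 and 4 (assumed).
sp_rf = receptive_field([(3, 1), (3, 4)])     # 11 -> 11x11 area

# Ratio of 2-D receptive areas: (11/5)^2 = 4.84, i.e. over four times.
print(unet_rf, sp_rf, (sp_rf / unet_rf) ** 2)
```

Dilation enlarges the receptive field without adding parameters, since the kernel weights are simply spaced farther apart, which is consistent with the abstract's "same scale parameters" claim.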
