Abstract

Background: In this study, a deep convolutional neural network (CNN)-based automatic segmentation technique was applied to multiple organs at risk (OARs) depicted in computed tomography (CT) images of lung cancer patients, and the results were compared with those generated by atlas-based automatic segmentation.

Materials and methods: An encoder-decoder U-Net neural network was built and trained. The trained deep CNN performed automatic segmentation of CT images for 36 lung cancer cases. The Dice similarity coefficient (DSC), the mean surface distance (MSD) and the 95% Hausdorff distance (95% HD) were calculated, with manual segmentation results used as the reference standard, and were compared with the results obtained through atlas-based segmentation.

Results: For the heart, lungs and liver, both the deep CNN-based and the atlas-based techniques performed satisfactorily (average values: 0.87 < DSC < 0.95, 1.8 mm < MSD < 3.8 mm, 7.9 mm < 95% HD < 11 mm). For the spinal cord and the oesophagus, the two methods showed statistically significant differences: for the atlas-based technique, the average values were 0.54 < DSC < 0.71, 2.6 mm < MSD < 3.1 mm and 9.4 mm < 95% HD < 12 mm; for the deep CNN-based technique, they were 0.71 < DSC < 0.79, 1.2 mm < MSD < 2.2 mm and 4.0 mm < 95% HD < 7.9 mm.

Conclusion: Automatic segmentation based on a deep convolutional neural network completed the segmentation tasks rapidly. Deep convolutional neural networks can be satisfactorily adapted to segment OARs during radiation treatment planning for lung cancer patients.
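For readers unfamiliar with the three evaluation metrics quoted above, the sketch below shows one common way to compute them for a binary predicted mask against a manual reference mask. It is a minimal illustration, not code from the paper: the function names, the use of NumPy/SciPy, and the voxel-spacing handling are all assumptions made for this example.

```python
# Illustrative sketch (not the authors' code): DSC, mean surface distance (MSD)
# and 95% Hausdorff distance (95% HD) for two binary 3-D segmentation masks.
import numpy as np
from scipy import ndimage
from scipy.spatial import cKDTree


def dice_coefficient(pred: np.ndarray, ref: np.ndarray) -> float:
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    return 2.0 * intersection / (pred.sum() + ref.sum())


def _surface_points(mask: np.ndarray, spacing) -> np.ndarray:
    """Coordinates (in mm) of the mask surface, taken as the erosion residue."""
    mask = mask.astype(bool)
    surface = mask & ~ndimage.binary_erosion(mask)
    return np.argwhere(surface) * np.asarray(spacing, dtype=float)


def surface_distances(pred, ref, spacing=(1.0, 1.0, 1.0)) -> np.ndarray:
    """Symmetric nearest-neighbour distances between the two surfaces."""
    pred_pts = _surface_points(pred, spacing)
    ref_pts = _surface_points(ref, spacing)
    d_pred_to_ref, _ = cKDTree(ref_pts).query(pred_pts)
    d_ref_to_pred, _ = cKDTree(pred_pts).query(ref_pts)
    return np.concatenate([d_pred_to_ref, d_ref_to_pred])


def mean_surface_distance(pred, ref, spacing=(1.0, 1.0, 1.0)) -> float:
    return float(surface_distances(pred, ref, spacing).mean())


def hausdorff_95(pred, ref, spacing=(1.0, 1.0, 1.0)) -> float:
    """95th-percentile Hausdorff distance; less outlier-sensitive than the maximum."""
    return float(np.percentile(surface_distances(pred, ref, spacing), 95))
```

In practice the CT voxel spacing (slice thickness and in-plane resolution) must be supplied so that MSD and 95% HD come out in millimetres, which is how the abstract reports them.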
