Abstract

Objective: To introduce an end-to-end automatic segmentation method for organs at risk (OARs) in chest computed tomography (CT) images based on dense-connection deep learning, and to provide an accurate auto-segmentation model that reduces the workload of radiation oncologists.

Methods: CT images of 36 lung cancer cases were included in this study. Of these, 27 cases were randomly selected as the training set, six cases as the validation set, and nine cases as the testing set. The left and right lungs, spinal cord, and heart were auto-segmented, and the training time was limited to approximately 5 h. Segmentation on the testing set was evaluated with geometric metrics, including the Dice similarity coefficient (DSC), 95% Hausdorff distance (HD95), and average surface distance (ASD). Two sets of treatment plans were then optimized, one based on the manually contoured OARs and one based on the automatically contoured OARs. Dosimetric parameters of the OARs, including Dmax and Vx, were obtained and compared.

Results: The proposed model outperformed U-Net in terms of DSC, HD95, and ASD, although the differences in segmentation results between the two networks were not statistically significant (P > 0.05). Compared with manual segmentation, auto-segmentation significantly reduced the contouring time, by approximately 40.7% (P < 0.05). The differences in dose-volume parameters between the two sets of plans were not statistically significant (P > 0.05).

Conclusion: The bilateral lungs, spinal cord, and heart can be accurately delineated using the DenseNet-based deep learning method. Feature map reuse is thus a promising approach to medical image auto-segmentation.
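The abstract does not state the implementation framework, so the following is only a minimal PyTorch sketch of the concatenation-based feature reuse that DenseNet-style blocks rely on, not the authors' actual network; the class name, growth_rate, and num_layers are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DenseBlock2D(nn.Module):
    """Minimal dense block: each layer receives the concatenation of all
    preceding feature maps, so earlier features are reused rather than
    recomputed (illustrative sketch, not the paper's architecture)."""

    def __init__(self, in_channels, growth_rate=16, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3,
                          padding=1, bias=False),
            ))
            channels += growth_rate
        self.out_channels = channels  # in_channels + num_layers * growth_rate

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # Concatenate every earlier feature map along the channel axis
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)
```

The geometric metrics named above (DSC, HD95, ASD) can be computed directly from binary masks; the sketch below assumes 3D NumPy masks and a voxel spacing in mm, and the helper names are hypothetical rather than taken from the paper.

```python
import numpy as np
from scipy import ndimage

def dice(pred, gt):
    """Dice similarity coefficient (DSC) between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    return 2.0 * np.logical_and(pred, gt).sum() / denom if denom else 1.0

def surface_distances(pred, gt, spacing=(1.0, 1.0, 1.0)):
    """Symmetric surface-to-surface distances (mm), the basis of HD95 and ASD."""
    def surface(mask):
        # Surface voxels = mask minus its erosion
        return mask & ~ndimage.binary_erosion(mask)

    ps, gs = surface(pred.astype(bool)), surface(gt.astype(bool))
    # Distance of every voxel to the nearest surface voxel of the other mask
    dt_g = ndimage.distance_transform_edt(~gs, sampling=spacing)
    dt_p = ndimage.distance_transform_edt(~ps, sampling=spacing)
    return np.concatenate([dt_g[ps], dt_p[gs]])

# HD95 = 95th percentile, ASD = mean of the symmetric surface distances:
# d = surface_distances(auto_mask, manual_mask, spacing=ct_spacing)
# hd95, asd = np.percentile(d, 95), d.mean()
```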
