Abstract
An important phase of radiation treatment planning is the accurate contouring of the organs at risk (OARs), which is necessary for dose distribution calculation. The manual contouring approach currently used in clinical practice is tedious, time-consuming, and prone to inter- and intra-observer variation. A deep learning-based auto-contouring tool can address these issues by accurately delineating OARs on computed tomography (CT) images. This paper proposes a two-stage deep learning-based segmentation model with an attention mechanism that automatically delineates OARs in thoracic CT images. After preprocessing the input CT volume, a 3D U-Net architecture locates each organ to generate cropped images for the segmentation network. Next, two differently configured U-Net-based networks segment the large organs (left lung, right lung, and heart) and the small organs (esophagus and spinal cord), respectively. A post-processing step integrates all the individually segmented organs to generate the final result. The proposed model outperformed state-of-the-art approaches in terms of Dice similarity coefficient (DSC) values for the lungs and the heart. Notably, it achieved a Dice score of 0.941 for the heart, 1.1% higher than the best previously reported score for this critical organ. Moreover, the clinical acceptability of the results is verified using dosimetric analysis. To delineate all five organs on a CT scan of size [Formula: see text], the model takes only 8.61 s. The proposed open-source automatic contouring tool can generate accurate contours in minimal time, consequently speeding up treatment planning and reducing treatment cost.
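The DSC reported above is the standard overlap metric for segmentation quality. As a minimal sketch (not the authors' implementation), it can be computed from two binary masks as 2|A ∩ B| / (|A| + |B|):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity coefficient (DSC) between two binary masks.

    DSC = 2 * |A intersect B| / (|A| + |B|); eps avoids division by
    zero when both masks are empty.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Toy example with two overlapping 2D masks (real use would be 3D CT masks)
a = np.zeros((4, 4), dtype=bool)
a[1:3, 1:3] = True  # 4 "voxels"
b = np.zeros((4, 4), dtype=bool)
b[1:3, 1:4] = True  # 6 "voxels", 4 overlapping with a
print(round(dice_coefficient(a, b), 3))  # → 0.8
```

A DSC of 1.0 indicates perfect overlap with the reference contour; the 0.941 reported for the heart therefore corresponds to near-complete agreement with the manual delineation.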