Abstract
Accurately localizing and segmenting thymoma in computed tomography (CT) images is of great importance for image-driven thymoma analysis. In clinical practice, manual diagnosis and segmentation of thymomas by radiologists are time-consuming and inefficient. It is therefore necessary to develop a method that achieves accurate and efficient automatic segmentation of thymoma. Here, a dense skip connection encoding–decoding model (DSC-Net), a deep convolutional neural network, was proposed for automatic segmentation of thymoma, with the ability to fuse feature maps from receptive fields of different scales. An image preprocessing method was also proposed to provide richer texture information and to enhance the contrast between thymoma and its surrounding tissues. A total of 310 subjects who underwent contrast-enhanced CT scanning were included in this ethically approved retrospective study. All CT slices were manually labeled by four experienced radiologists; 80% of the images formed the training set and the remainder the testing set. Segmentation performance was evaluated by computing the accuracy, intersection over union (IoU), and boundary F1 contour matching score (BFScore) between the predicted segmentation and the manual labels. For segmentation of thymoma in the testing set, the accuracy, IoU, and BFScore were 92.96%, 87.86%, and 0.9087, respectively. Compared with the U-Net method, DSC-Net improved IoU by 3.94%. In addition, the efficacy and robustness of DSC-Net were verified for segmentation across different patients and different types of thymoma classified by the WHO histological classification criteria. The proposed preprocessing method and DSC-Net demonstrated improved performance in segmentation of thymomas, suggesting the ability to provide consistent delineation and assist radiologists in clinical applications.
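The pixel accuracy and IoU metrics reported above can be sketched as follows. This is an illustrative example only, not the paper's evaluation code; the function names, the toy masks, and the convention that an empty-vs-empty comparison yields an IoU of 1 are assumptions for illustration.

```python
import numpy as np

def pixel_accuracy(pred: np.ndarray, label: np.ndarray) -> float:
    """Fraction of pixels where the predicted mask agrees with the manual label."""
    return float((pred == label).mean())

def iou(pred: np.ndarray, label: np.ndarray) -> float:
    """Intersection over union of the foreground (tumor) class."""
    pred = pred.astype(bool)
    label = label.astype(bool)
    union = np.logical_or(pred, label).sum()
    if union == 0:  # both masks empty: define IoU as 1 (assumed convention)
        return 1.0
    inter = np.logical_and(pred, label).sum()
    return float(inter / union)

# Toy 4x4 binary masks: prediction has 4 foreground pixels, label has 3,
# and they overlap on 3 pixels.
pred  = np.array([[0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
label = np.array([[0, 1, 1, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
print(pixel_accuracy(pred, label))  # 15/16 = 0.9375
print(iou(pred, label))             # 3/4 = 0.75
```

In practice these per-slice scores would be averaged over the testing set; the BFScore additionally compares predicted and labeled boundaries rather than whole regions.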