Abstract

Automatic lung segmentation is an essential step towards computer-aided diagnosis of lung CT scans. However, in the presence of dense abnormalities, existing methods fail to segment the lungs accurately. In this paper, a generative adversarial network-based approach is proposed to improve the accuracy of lung segmentation. The proposed network effectively segments the lung region from the surrounding chest region and is hence named LungSeg-Net. In the proposed LungSeg-Net, the input lung CT slices are processed through a series of encoders that encode the slices into a set of feature maps. A multi-scale dense-feature extraction (MSDFE) module is then designed to extract multi-scale features from the encoded feature maps. Finally, decoders are employed to obtain the lung segmentation map from the multi-scale features. The MSDFE module enables the network to learn features relevant to dense abnormalities, while the iterative down-sampling followed by up-sampling makes the network invariant to the size of the abnormality. The publicly available benchmark ILD dataset is used for the experimental analysis. Qualitative and quantitative analyses have been carried out to compare the performance of the proposed network with existing state-of-the-art lung segmentation methods. The experimental analysis shows that the performance of the proposed LungSeg-Net is invariant to the presence of dense abnormalities in lung CT scans.
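The abstract does not specify the internals of the MSDFE module, so the following is only a minimal NumPy sketch of one common way to realize multi-scale dense-feature extraction: the same feature map is filtered at several dilation rates, and each branch reuses the outputs of the previous branches (dense connectivity). The kernel, dilation rates, and fusion rule here are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def dilated_conv2d(x, kernel, dilation):
    """'Same'-padded 2D convolution of a single-channel map with a
    dilated 3x3 kernel (naive loops, for illustration only)."""
    kh, kw = kernel.shape
    pad = dilation * (kh // 2)
    xp = np.pad(x, pad)  # zero padding keeps the output the same size
    h, w = x.shape
    out = np.zeros((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            acc = 0.0
            for ki in range(kh):
                for kj in range(kw):
                    # dilation spaces the taps, enlarging the receptive field
                    acc += kernel[ki, kj] * xp[i + ki * dilation, j + kj * dilation]
            out[i, j] = acc
    return out

def msdfe(feature_map, dilation_rates=(1, 2, 4)):
    """Hypothetical MSDFE sketch: one branch per dilation rate, with
    each branch fed a fusion of all earlier outputs (dense reuse).
    Returns the branch outputs stacked along a new 'scale' axis."""
    kernel = np.full((3, 3), 1.0 / 9.0)  # placeholder averaging kernel
    branches = [feature_map.astype(float)]
    for d in dilation_rates:
        fused = np.mean(branches, axis=0)  # dense connectivity via mean fusion
        branches.append(dilated_conv2d(fused, kernel, d))
    return np.stack(branches[1:], axis=0)  # shape: (num_scales, H, W)
```

Because every branch keeps the spatial size of its input, the stacked multi-scale output can be passed straight to a decoder stage; larger dilation rates cover larger abnormalities without extra down-sampling.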

