Abstract
Automatic and accurate thorax disease diagnosis in chest X-ray (CXR) images plays an essential role in computer-assisted clinical analysis. However, imaging noise and the visual similarity between diseased regions and their surroundings make precise analysis of thoracic disease challenging. In this study, we propose a novel knowledge-guided deep zoom neural network (KGZNet), a data-driven model. Because thoracic diseases are typically confined to the lung regions, our approach leverages prior medical knowledge to guide the training process. We also employ weakly supervised learning (WSL) to search for finer regions without using annotated samples. Learning at each scale is performed by a classification sub-network. KGZNet starts from the global image and iteratively generates discriminative parts from coarse to fine; each finer-scale sub-network takes as input an amplified, attended discriminative region from the previous scale in a recurrent way. Specifically, we first train a robust modified U-Net model for lung segmentation and extract the lung area from the original CXR image with the Lung Region Generator. Then, guided by the attention heatmap, we obtain a finer discriminative lesion region from the lung-region image with the Lesion Region Generator. Finally, the most discriminative feature knowledge is fused, and complementary feature information is learned for the final disease prediction. Extensive experiments demonstrate that our method effectively leverages discriminative region information and significantly outperforms other state-of-the-art methods on the thoracic disease recognition task. Furthermore, the proposed KGZNet gradually learns discriminative regions from coarse to fine in a mutually reinforced way. The code will be available at: https://github.com/ISSE-AILab/KGZNet.
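To make the coarse-to-fine zooming concrete, the following is a minimal, illustrative sketch (not the authors' released code) of how an attention heatmap could be thresholded to crop and amplify a discriminative region that a finer-scale sub-network then receives; the function name, threshold value, and output size are assumptions.

```python
import torch
import torch.nn.functional as F

def crop_attended_region(image, heatmap, threshold=0.7, out_size=224):
    """Hypothetical attention-guided cropping in the spirit of the Lesion Region Generator.

    image:   (C, H, W) input at the current scale
    heatmap: (h, w) class-activation-style attention map for that image
    Returns the amplified crop passed to the next-scale sub-network.
    """
    # Upsample the heatmap to the image resolution and normalize to [0, 1]
    heat = F.interpolate(heatmap[None, None], size=image.shape[1:],
                         mode="bilinear", align_corners=False)[0, 0]
    heat = (heat - heat.min()) / (heat.max() - heat.min() + 1e-8)

    # Keep the most activated pixels and take their bounding box
    mask = heat >= threshold
    ys, xs = torch.nonzero(mask, as_tuple=True)
    y0, y1 = ys.min().item(), ys.max().item() + 1
    x0, x1 = xs.min().item(), xs.max().item() + 1

    # Crop the discriminative region and amplify it back to the input size
    crop = image[:, y0:y1, x0:x1]
    return F.interpolate(crop[None], size=(out_size, out_size),
                         mode="bilinear", align_corners=False)[0]
```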
Highlights
Chest X-ray (CXR) is one of the most common types of medical imaging examinations globally, effectively assisting the clinical diagnosis and treatment of a range of thoracic diseases, including lung cancer, tuberculosis, and pneumonia.
Dataset and implementation details: the proposed method was validated on the NIH ChestX-ray14 dataset [16], which is a widely used benchmark for multi-label thoracic disease classification in CXR images.
In the dataset, 19,894 samples are labeled with "Infiltration," while only 227 samples are labeled with "Hernia." The dataset also contains 984 ground-truth bounding boxes annotated by board-certified radiologists for 880 CXR images.
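As an illustration of how these multi-label annotations can be consumed, below is a minimal, hypothetical sketch (not from the paper) that converts the dataset's "|"-separated finding labels into 14-dimensional multi-hot targets; the metadata file name and column name follow the public NIH ChestX-ray14 release, and the helper names are assumptions.

```python
import numpy as np
import pandas as pd

# The 14 thoracic disease labels used in ChestX-ray14 (NIH)
CLASSES = ["Atelectasis", "Cardiomegaly", "Effusion", "Infiltration", "Mass",
           "Nodule", "Pneumonia", "Pneumothorax", "Consolidation", "Edema",
           "Emphysema", "Fibrosis", "Pleural_Thickening", "Hernia"]

def multi_hot(finding_labels: str) -> np.ndarray:
    """Convert a '|'-separated label string (e.g. 'Effusion|Infiltration')
    into a 14-dim multi-hot target; 'No Finding' maps to all zeros."""
    target = np.zeros(len(CLASSES), dtype=np.float32)
    for name in finding_labels.split("|"):
        if name in CLASSES:
            target[CLASSES.index(name)] = 1.0
    return target

# Illustrative usage with the dataset's metadata file
df = pd.read_csv("Data_Entry_2017.csv")
labels = np.stack([multi_hot(s) for s in df["Finding Labels"]])
```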
Summary
Chest X-ray (CXR) is one of the most common types of medical imaging examinations globally, effectively assisting the clinical diagnosis and treatment of a range of thoracic diseases, including lung cancer, tuberculosis, and pneumonia. The main extensions include: (1) an ablation study of different scale training strategies, (2) a detailed discussion of the network's parameters, (3) network training strategies and algorithm implementation, (4) the performance of lung segmentation and lesion localization, and (5) a discussion of the limitations of the proposed method and verification of its robustness on another chest X-ray dataset. We propose to learn the most discriminative regions guided by prior clinical knowledge, together with finer region-based feature representations on the multi-scale branches. Compared with previous work that relies on bounding-box annotations to localize discriminative regions, our method uses only medical domain knowledge and image-level labels. This approach can effectively reduce human effort and improve diagnostic efficiency in practical applications. The feature fusion module concatenates the global average pooling layers of the three feature extractors operating at different scopes and is fine-tuned for the final thoracic disease classification.
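As a concrete illustration of this fusion step, the following is a minimal sketch, not the authors' implementation, of a fusion head that concatenates globally pooled features from the three branches and predicts the 14 labels with independent sigmoids; the channel widths (2048, as in a ResNet-50-style backbone) and the class count are assumptions.

```python
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    """Illustrative fusion head: concatenate globally pooled features from the
    three scale branches (global image, lung region, lesion region) and predict
    the 14 disease labels with a single fully connected layer."""

    def __init__(self, feat_dims=(2048, 2048, 2048), num_classes=14):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # global average pooling
        self.fc = nn.Linear(sum(feat_dims), num_classes)

    def forward(self, global_feat, lung_feat, lesion_feat):
        # Each *_feat is an (N, C, H, W) feature map from one branch's backbone
        pooled = [self.pool(f).flatten(1) for f in (global_feat, lung_feat, lesion_feat)]
        logits = self.fc(torch.cat(pooled, dim=1))
        # Multi-label prediction: independent sigmoid per disease
        return torch.sigmoid(logits)
```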