Chest radiography is a comparatively low-cost and widely used medical imaging procedure that provides crucial information for reaching diagnostic conclusions. Chest X-rays (CXR) are used in the diagnosis of chest diseases, including asthma, lung cancer, pneumonia, and COVID-19. Artificial intelligence enables automatic multi-disease detection, improving efficiency and performance by addressing image recognition challenges with various machine learning and deep learning approaches. Convolutional Neural Networks (CNNs) have been designed to advance computerized recognition systems. For medical image classification, texture, shape, size, and tissue composition are essential features for disease detection. Hence, rich input features are fused with deep CNNs to increase the effectiveness of CXR analysis. In addition, multi-scale features are captured in CNNs to detect thoracic diseases of variable size. This paper implements a novel multi-disease diagnosis model for chest X-ray images using deep learning approaches, which also helps to minimize computational cost. The CXR images are gathered from standard datasets. Pre-processing is then performed to clean the images and enhance their contrast. Further, image segmentation is carried out using Optimized DeepLabv3 (ODeepLabv3), in which the parameters of DeepLabv3 are tuned by a new Mutation Rate-based Lion Algorithm (MR-LA). Multi-disease classification is then performed through Optimized Ensemble Transfer Learning (OETL), which combines VGG16, ResNet, ImageNet, MobileNet, GoogleNet, Inception, and Xception; the parameters of all models are likewise optimized by the same MR-LA. The proposed model improves effectiveness and performance in terms of sensitivity, precision, and specificity, and achieves higher classification and detection accuracy. The superiority of the recommended method over other models is demonstrated through the comparative analysis conducted.
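The abstract does not specify implementation details for the OETL classification stage, so the following is only a minimal sketch of how several ImageNet-pretrained backbones could be ensembled for multi-label CXR classification, assuming Keras/TensorFlow. The backbone selection, input size, score-averaging fusion, and class count are illustrative assumptions; this does not reproduce the authors' OETL design or the MR-LA parameter tuning.

```python
# Illustrative sketch only: an averaged ensemble of ImageNet-pretrained
# backbones for multi-label chest X-ray classification. Backbones, input
# size, and fusion by averaging are assumptions, not the paper's OETL,
# and MR-LA-based parameter optimization is not included.
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16, ResNet50, MobileNet

NUM_CLASSES = 4            # e.g. asthma, lung cancer, pneumonia, COVID-19
INPUT_SHAPE = (224, 224, 3)

def make_branch(backbone_cls, name):
    """Frozen pretrained backbone followed by a small trainable head."""
    base = backbone_cls(weights="imagenet", include_top=False,
                        input_shape=INPUT_SHAPE)
    base.trainable = False                       # transfer learning: freeze features
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(256, activation="relu")(x)
    out = layers.Dense(NUM_CLASSES, activation="sigmoid",
                       name=f"{name}_out")(x)    # per-disease probabilities
    return Model(base.input, out, name=f"{name}_branch")

inputs = layers.Input(shape=INPUT_SHAPE)         # per-backbone preprocessing omitted for brevity
branches = [make_branch(VGG16, "vgg16"),
            make_branch(ResNet50, "resnet50"),
            make_branch(MobileNet, "mobilenet")]
outputs = [branch(inputs) for branch in branches]
ensemble_out = layers.Average()(outputs)         # simple score-level fusion
ensemble = Model(inputs, ensemble_out, name="cxr_ensemble")

ensemble.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                 loss="binary_crossentropy",
                 metrics=[tf.keras.metrics.AUC(multi_label=True)])
ensemble.summary()
```

In such a setup, the segmented CXR images produced by the ODeepLabv3 stage would be resized to the common input shape and fed to all branches, with the averaged sigmoid scores serving as the multi-disease prediction.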