Abstract

The global spread of COVID-19 has seriously harmed both public health and the world economy. Faster and more precise identification of infected patients, the motivating focus of this study, is one of the most crucial steps in treating COVID-19. In this investigation, we evaluate the performance of three pre-trained CNN (PCNN) architectures for detection from chest X-rays: ResNet152, DenseNet121, and InceptionV3. The 2,905 chest X-rays in our dataset were split into three groups: COVID-19 infection (132 cases), viral pneumonia (90 cases), and unaffected chest X-rays (184 cases). Using deep learning techniques, our primary goal was to identify structural anomalies that allow COVID-19-positive, viral-pneumonia, and normal cases to be differentiated. Each of the three PCNN models was implemented and trained both with and without transfer learning. DenseNet121 delivered the best results, with training, validation, and test accuracies of 99.78%, 94.00%, and 96.42%, respectively. More importantly, the model performed well at identifying COVID-19 instances in the test dataset (precision = 96.05, sensitivity = 96.83, kappa = 94.50). As a result, DenseNet121, one of the PCNNs described in this article, may serve as a trustworthy approach for the quicker and more precise identification of COVID-19 cases.
