Abstract
Chest CT is used in the COVID-19 diagnosis process as a significant complement to the reverse transcription polymerase chain reaction (RT-PCR) technique. However, it has several drawbacks, including long disinfection and ventilation times, excessive radiation exposure, and high costs. While X-ray radiography is more practical for detecting COVID-19, it is insensitive in the early stages of the disease. We have developed inference engines that can turn X-ray machines into powerful diagnostic tools by using deep learning technology to detect COVID-19. We named these engines COV19-CNNet and COV19-ResNet. The former is based on a convolutional neural network architecture; the latter on a residual neural network (ResNet) architecture. This research is a retrospective study. The database consists of 210 COVID-19, 350 viral pneumonia, and 350 normal (healthy) chest X-ray (CXR) images compiled from two different data sources. This study focused on the problem of multi-class classification (COVID-19, viral pneumonia, and normal), which is a rather difficult task for the diagnosis of COVID-19. The classification accuracy levels for COV19-ResNet and COV19-CNNet were 97.61% and 94.28%, respectively. Unlike other studies in the field, the inference engines were developed from scratch using new, purpose-built deep neural networks without pre-trained models. These diagnostic engines allow for the early detection of COVID-19 as well as its differentiation from viral pneumonia, which has a similar radiological appearance. Thus, they can support faster recovery through detection at the early stages, help prevent the COVID-19 outbreak from spreading, and contribute to reducing pressure on health-care systems worldwide.
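The abstract does not specify the layer configuration of either network. As a rough, non-authoritative illustration of the kind of model it describes, the sketch below builds a small residual network from scratch (no pre-trained weights) with a three-way output for COVID-19, viral pneumonia, and normal. The layer sizes, the 224x224 grayscale input, and all hyperparameters are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch of a from-scratch residual classifier for 3-class CXR
# classification. All architectural choices here are hypothetical.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: two 3x3 convolutions with an identity shortcut."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # shortcut connection

class ThreeClassResNet(nn.Module):
    """Small residual network trained from scratch for the three classes:
    COVID-19, viral pneumonia, and normal (hypothetical configuration)."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, 32, 7, stride=2, padding=3, bias=False),  # grayscale CXR input
            nn.BatchNorm2d(32), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2, padding=1),
        )
        self.blocks = nn.Sequential(ResidualBlock(32), ResidualBlock(32))
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_classes),  # logits for the 3 classes
        )

    def forward(self, x):
        return self.head(self.blocks(self.stem(x)))

# Example: one 224x224 grayscale chest X-ray through the network.
model = ThreeClassResNet()
logits = model(torch.randn(1, 1, 224, 224))
print(logits.shape)  # torch.Size([1, 3])
```

A plain CNN variant in the spirit of COV19-CNNet would follow the same pattern with the residual shortcuts removed; the abstract gives no further architectural detail.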