Abstract

In the Covid-19 era, there is a need for an edge detector for X-ray (XR) images affected by uncertainties that combines low computational load with high performance. Accordingly, a new version of a well-known fuzzy edge detector is proposed here, in which a new image fuzzification procedure is formulated. Its performance was qualitatively and quantitatively compared with that of Canny's edge detector (the gold standard for this type of problem). In addition, an evolution of the deep fuzzy-neural model named CovNNet, recently proposed by the authors to discriminate chest XR (CXR) images of patients with Covid-19 pneumonia from images of patients with interstitial pneumonias not related to Covid-19 (No-Covid-19), is presented and referred to as Enhanced-CovNNet (ECovNNet). Its generalization ability is improved by introducing dropout regularization, which randomly drops some nodes of the network during training. ECovNNet processes input CXR images together with the corresponding fuzzy CXR images (produced by the proposed enhanced fuzzy edge detector) and extracts relevant CXR/fuzzy features, which are subsequently combined into a single array named the CXR and fuzzy features vector. The latter is used as input to an Autoencoder-(AE)-based classifier that performs the binary classification, Covid-19 versus No-Covid-19, achieving an accuracy of up to 81%. Finally, the work is completed with some interesting physico-mathematical results.
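The abstract describes a pipeline in which a CXR image is fuzzified, edges are extracted, and raw and fuzzy features are concatenated into a single vector for classification. The paper's actual fuzzification procedure and feature extractor are not given in the abstract, so the sketch below is purely illustrative: it uses a hypothetical sigmoid (S-shaped) membership function for fuzzification and simple gradient-magnitude edges in place of the enhanced fuzzy detector and ECovNNet features.

```python
import numpy as np

def fuzzify(image, a=0.5, b=10.0):
    """Map pixel intensities in [0, 1] to fuzzy membership degrees.

    A sigmoid membership function is used here as a placeholder; the
    paper's actual fuzzification procedure is not described in the
    abstract, so `a` (crossover point) and `b` (steepness) are
    illustrative parameters, not values from the paper.
    """
    return 1.0 / (1.0 + np.exp(-b * (np.asarray(image, dtype=float) - a)))

def edge_magnitude(image):
    """Gradient-magnitude edges via central differences (stand-in for
    the enhanced fuzzy edge detector / Canny baseline)."""
    gy, gx = np.gradient(np.asarray(image, dtype=float))
    return np.hypot(gx, gy)

# Toy pipeline on a synthetic 8x8 intensity ramp: fuzzify the image,
# compute edges of both versions, and concatenate the two feature sets
# into a single "CXR and fuzzy features" vector.
img = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
fuzzy_img = fuzzify(img)
features = np.concatenate([edge_magnitude(img).ravel(),
                           edge_magnitude(fuzzy_img).ravel()])
```

In the paper, this combined vector would then be fed to the AE-based classifier; here it is simply a NumPy array of length 128 (64 raw-edge plus 64 fuzzy-edge values for the 8x8 toy image).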
