Abstract

COVID-19 is a highly infectious disease with high morbidity and mortality, requiring tools to support rapid triage and risk stratification. In response, deep learning has demonstrated great potential to quickly and autonomously detect COVID-19 features in lung ultrasound B-mode images. However, no previous work considers the application of these deep learning models to signal processing stages that occur prior to traditional ultrasound B-mode image formation. Considering the multiple signal processing stages required to achieve ultrasound B-mode images, our research objective is to investigate the most appropriate stage for our deep learning approach to COVID-19 B-line feature detection, starting with raw channel data received by an ultrasound transducer. Results demonstrate that for our given training and testing configuration, the maximum Dice similarity coefficient (DSC) was produced by B-mode images (DSC = 0.996) when compared with three alternative image formation stages that can serve as network inputs: (1) raw in-phase and quadrature (IQ) data before beamforming, (2) beamformed IQ data, and (3) envelope-detected IQ data. The best-performing simulation-trained network was tested on in vivo B-mode images of COVID-19 patients, ultimately achieving 76% accuracy in detecting the same (82% of cases) or more (18% of cases) B-line features as human observers interpreting B-mode images. These results support proceeding with future COVID-19 B-line feature detection using ultrasound B-mode images as the input to deep learning models.
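The following is a minimal sketch, not the authors' code, illustrating two concepts referenced in the abstract: converting beamformed IQ data to a B-mode image via envelope detection and log compression, and scoring a predicted B-line segmentation against a reference mask with the Dice similarity coefficient (DSC). The array shapes, the 60 dB dynamic range, and the toy masks are illustrative assumptions only.

```python
# Hypothetical illustration of the image formation stages and DSC metric
# described in the abstract; not the study's actual pipeline.
import numpy as np

def iq_to_bmode(iq, dynamic_range_db=60.0):
    """Envelope-detect complex IQ data and log-compress to a B-mode image (in dB)."""
    envelope = np.abs(iq)                          # envelope detection (magnitude of IQ)
    envelope /= envelope.max() + 1e-12             # normalize to [0, 1]
    bmode_db = 20.0 * np.log10(envelope + 1e-12)   # log compression
    return np.clip(bmode_db, -dynamic_range_db, 0.0)

def dice_coefficient(pred_mask, ref_mask):
    """DSC = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks."""
    pred = pred_mask.astype(bool)
    ref = ref_mask.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

if __name__ == "__main__":
    # Synthetic complex IQ frame (axial x lateral samples); shapes are assumptions.
    rng = np.random.default_rng(0)
    iq = rng.standard_normal((512, 128)) + 1j * rng.standard_normal((512, 128))
    bmode = iq_to_bmode(iq)

    # Toy masks standing in for a network's B-line prediction vs. a reference annotation.
    ref = np.zeros((512, 128), dtype=bool);  ref[:, 60:70] = True
    pred = np.zeros((512, 128), dtype=bool); pred[:, 58:68] = True
    print(f"B-mode range: [{bmode.min():.1f}, {bmode.max():.1f}] dB")
    print(f"DSC: {dice_coefficient(pred, ref):.3f}")
```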
