Abstract

The paper proposes a graph-theoretical approach to auscultation, bringing out the potential of graph features for classifying bioacoustic signals. A complex network analysis of the bioacoustic signals - vesicular (VE) and bronchial (BR) breath sounds - of 48 healthy persons is carried out to understand the airflow dynamics during respiration. VE and BR are classified using machine learning techniques based on the extracted graph features: the number of edges (E), graph density (D), transitivity (T), degree centrality (Dcg), and eigenvector centrality (Ecg). The higher values of E, D, and T in BR indicate temporally correlated airflow through the wider tracheobronchial tract, resulting in sustained, high-intensity low frequencies. The frequency spread and high frequencies in VE, arising from the less correlated airflow through the narrow lobar and segmental bronchi, appear as lower values of E, D, and T. The lower values of Dcg and Ecg support the inferences drawn from the spectral and other graph parameters. The study proposes a methodology for remote auscultation that can be employed in the current scenario of COVID-19.
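
The abstract does not specify how each breath-sound signal is mapped to a network, so the following is only a minimal sketch: it assumes a natural visibility graph construction (a common choice for turning a time series into a graph) and then computes the five named graph features with networkx, summarising the node-level centralities Dcg and Ecg by their mean, which is also an assumption.

```python
import numpy as np
import networkx as nx

def natural_visibility_graph(signal):
    """Map a 1-D signal to a natural visibility graph (assumed construction).

    Nodes are samples; samples i and j are connected if every intermediate
    sample lies strictly below the straight line joining them.
    """
    n = len(signal)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            # Visibility criterion: y_k < y_j + (y_i - y_j) * (j - k) / (j - i)
            if all(signal[k] < signal[j] + (signal[i] - signal[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                g.add_edge(i, j)
    return g

def graph_features(g):
    """Compute the five graph features named in the abstract."""
    dcg = nx.degree_centrality(g)
    ecg = nx.eigenvector_centrality(g, max_iter=1000)
    return {
        "E": g.number_of_edges(),                    # number of edges
        "D": nx.density(g),                          # graph density
        "T": nx.transitivity(g),                     # transitivity
        "Dcg": float(np.mean(list(dcg.values()))),   # mean degree centrality (assumed aggregation)
        "Ecg": float(np.mean(list(ecg.values()))),   # mean eigenvector centrality (assumed aggregation)
    }

# Illustrative run on a short synthetic frame; a real pipeline would use breath-sound frames.
rng = np.random.default_rng(0)
frame = rng.normal(size=256)
print(graph_features(natural_visibility_graph(frame)))
```

In a classification setting, such a feature vector would be computed per signal frame and fed to a standard classifier; the specific learner used in the paper is not stated in the abstract.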
