Abstract
Non-contact respiratory monitoring is gaining attention due to its unobtrusive form of measurement. Among non-contact sensing modalities, optical sensors integrated with commercial video cameras have received the most attention thanks to their ease of use and low cost, allowing non-experts to employ these methods without prior expertise. Prior research mostly focused on extracting breathing-rate and heart-rate-related information from camera recordings of sedentary breathing patterns to determine health status. However, for abnormal breathing patterns, a range of respiratory-pattern-related signatures needs to be monitored to investigate the primary cause of the pattern change. In this paper, we propose a novel feature extraction technique to recognize respiratory patterns such as eupnea, tachypnea, bradypnea, and apnea retrieved from videos captured using a single digital camera working in the visible range. To this aim, a hyper-feature algorithm has been implemented to extract distinguishable air-flow-related features from videos collected from twenty-four participants exhibiting four different respiratory patterns. After training the algorithm using the reference respiratory signal obtained from a medical-grade chest strap, the performance of the system has been evaluated on five additional participants whose measurements were not used for algorithm development and on seven patients. Results demonstrated an accuracy of 96.07% in the recognition of all the respiratory patterns, with a minimum performance of 81.81% in detecting bradypnea and 100% in detecting apnea events. Additionally, when the patient dataset containing the apnea and normal breathing patterns is included, the accuracy of the proposed algorithm rises to 97.39%, demonstrating its robustness on patient data.
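For illustration only, the sketch below shows one simple way to label a respiratory waveform window (e.g. a camera-derived air-flow signal or a chest-strap trace) as eupnea, tachypnea, bradypnea, or apnea using standard clinical rate bands. The thresholds, function names, and the spectral-peak rate estimate are assumptions made for this example; they are not the paper's hyper-feature algorithm, which learns its decision rules from the reference chest-strap signal.

```python
import numpy as np

# Illustrative clinical rate bands (breaths per minute); assumed for this
# sketch, not taken from the paper's trained hyper-feature algorithm.
BRADYPNEA_MAX_BPM = 12.0   # below this rate: bradypnea
EUPNEA_MAX_BPM = 20.0      # 12-20 breaths/min: eupnea; above: tachypnea


def breathing_rate_bpm(window: np.ndarray, fs: float) -> float:
    """Estimate breaths per minute from the dominant spectral peak in 0.05-1 Hz."""
    window = window - np.mean(window)
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    band = (freqs >= 0.05) & (freqs <= 1.0)        # roughly 3 to 60 breaths/min
    peak_freq = freqs[band][np.argmax(power[band])]
    return 60.0 * peak_freq


def classify_pattern(window: np.ndarray, fs: float,
                     apnea_amplitude: float = 0.05) -> str:
    """Assign eupnea / tachypnea / bradypnea / apnea to one signal window."""
    if np.ptp(window) < apnea_amplitude:           # near-flat trace: no airflow
        return "apnea"
    rate = breathing_rate_bpm(window, fs)
    if rate < BRADYPNEA_MAX_BPM:
        return "bradypnea"
    if rate <= EUPNEA_MAX_BPM:
        return "eupnea"
    return "tachypnea"


if __name__ == "__main__":
    fs = 30.0                                      # e.g. a 30 fps camera signal
    t = np.arange(0, 30, 1.0 / fs)
    eupneic = np.sin(2 * np.pi * 0.25 * t)         # synthetic 15 breaths/min trace
    print(classify_pattern(eupneic, fs))           # prints "eupnea"
```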