Abstract

Respiration rate (RR) is one of the physiological signals worth monitoring when assessing human health and emotional states. However, traditional devices, such as the respiration belt worn around the chest, are not always a feasible solution (e.g., in telemedicine, or because of device discomfort). Recently, novel approaches have been proposed to estimate RR in a less invasive yet reliable way, requiring only the acquisition and processing of contact or remote photoplethysmography (contact PPG and remote-PPG, respectively). The aim of this paper is to address the lack of systematic evaluation of the proposed methods on publicly available datasets, which currently prevents a fair comparison among them. In particular, we evaluate two prominent families of PPG processing methods that estimate Respiratory Induced Variations (RIVs): the first encompasses methods based on the direct extraction of morphological features related to RR; the second includes methods that model respiratory artifacts, adopting, in the most promising cases, single-channel blind source separation. Extensive experiments carried out on the public BP4D+ dataset show that the morphological estimation of RIVs is more reliable than that produced by single-channel blind source separation (in both the contact and remote testing phases), as well as in comparison with a representative state-of-the-art Deep Learning-based approach for remote estimation of respiratory information.
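For readers unfamiliar with the morphological family of methods, the following is a minimal, illustrative sketch of the general idea: a respiratory-induced variation (here, the pulse-amplitude variation, RIAV) is extracted from the detected PPG pulse peaks, resampled uniformly, and RR is taken as the dominant spectral peak within a plausible respiratory band. The function name, sampling rates, and band limits are assumptions for illustration, not the specific configuration evaluated in the paper.

```python
import numpy as np
from scipy.signal import find_peaks, periodogram

def estimate_rr_from_riav(ppg, fs, rr_band=(0.1, 0.5)):
    """Estimate respiration rate (breaths/min) from the RIAV of a PPG signal (illustrative)."""
    # 1) Locate systolic peaks (a minimum distance of ~0.4 s assumes HR below ~150 bpm).
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))
    if len(peaks) < 4:
        raise ValueError("Not enough pulses to estimate RR")

    # 2) Build the RIV time series: pulse amplitudes sampled at the peak times.
    riav = ppg[peaks]
    t_peaks = peaks / fs

    # 3) Resample the unevenly spaced RIV onto a uniform 4 Hz grid (assumed rate).
    fs_riv = 4.0
    t_uniform = np.arange(t_peaks[0], t_peaks[-1], 1.0 / fs_riv)
    riav_uniform = np.interp(t_uniform, t_peaks, riav)
    riav_uniform -= riav_uniform.mean()

    # 4) Pick the dominant frequency inside the respiratory band (0.1-0.5 Hz assumed).
    freqs, psd = periodogram(riav_uniform, fs=fs_riv)
    mask = (freqs >= rr_band[0]) & (freqs <= rr_band[1])
    f_resp = freqs[mask][np.argmax(psd[mask])]
    return 60.0 * f_resp  # breaths per minute
```

Analogous RIV signals can be derived from pulse-to-pulse intervals (frequency variation) or baseline drift (intensity variation); the artifact-modeling family instead separates the respiratory component directly from the raw PPG, e.g., via single-channel blind source separation.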
