Abstract

Regular monitoring of common physiological signs, including heart rate, blood pressure, and oxygen saturation, can be an effective way to prevent or detect many kinds of chronic conditions. Cardiovascular diseases (CVDs) in particular are a worldwide concern: according to the World Health Organization, 32% of all deaths worldwide are from CVDs. In addition, stress-related illnesses account for $190 billion in healthcare costs per year. Currently, contact devices are required to extract most of an individual's physiological information, which can be uncomfortable for users. In recent years, however, remote photoplethysmography (rPPG) has gained interest: it enables contactless monitoring of the blood volume pulse signal using a regular camera and can ultimately provide the same physiological information as a contact device. In this paper, we propose a benchmark comparison using a new multimodal database consisting of 56 subjects, each of whom performed three different tasks. Each subject wore a wearable device capable of extracting photoplethysmography (PPG) signals and was filmed to allow simultaneous rPPG signal extraction. Several experiments were conducted, including a comparison between information from contact and remote signals and stress-state recognition. Results show that, in this dataset, rPPG signals handled motion artifacts better than contact PPG sensors and overall had better quality than the signals from the contact sensor. Moreover, analysis of variance showed that at least two heart-rate variability (HRV) features, NNi 20 and SAMPEN, were capable of differentiating between stress and non-stress states. In addition, three features, inter-beat interval (IBI), NNi 20, and SAMPEN, were capable of differentiating between tasks of different difficulty levels.
Furthermore, machine-learning models trained to classify "stressed" versus "unstressed" states achieved an accuracy of 83.11%.
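To make the two discriminative HRV features concrete, the sketch below shows one common way to compute NNi 20 (the count of successive inter-beat-interval differences exceeding 20 ms) and a simplified sample entropy (SAMPEN) from a series of NN intervals. This is an illustrative assumption, not the authors' implementation; the function names, the default tolerance `r = 0.2 * std`, and the simplified template-matching formulation are choices made here for clarity.

```python
import numpy as np

def nni_20(nn_intervals_ms):
    """NNi 20: count of successive NN-interval differences greater than 20 ms."""
    diffs = np.abs(np.diff(np.asarray(nn_intervals_ms, dtype=float)))
    return int(np.sum(diffs > 20))

def sample_entropy(series, m=2, r=None):
    """Simplified sample entropy (SampEn): lower values indicate a more
    regular signal. Uses Chebyshev distance between length-m templates and
    a tolerance r (defaulting to 0.2 * standard deviation, a common choice)."""
    x = np.asarray(series, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def count_matches(length):
        # Count template pairs whose maximum pointwise distance is within r,
        # excluding self-matches.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        total = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            total += int(np.sum(d <= r)) - 1  # subtract the self-match
        return total

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

For example, for NN intervals `[800, 850, 855, 900]` the successive differences are 50, 5, and 45 ms, so `nni_20` returns 2; a stressed state typically shifts such beat-to-beat variability measures.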
