Abstract

Driven by the demand for accessible and convenient healthcare, non-contact heart rate (HR) monitoring based on consumer-grade cameras has gained significant attention among researchers. However, this technology suffers from limited reliability and consistency under realistic conditions involving motion artifacts, illumination variations, and diverse skin tones, which prevents it from emerging as an alternative to conventional methods. To address these challenges, this paper proposes an effective technique for HR measurement from facial RGB videos. The face, taken as the region of interest (ROI), is divided into several small sub-ROIs of equal size. A group of quality sub-ROIs is formed and weighted according to a fundamental periodicity coefficient to handle spatially non-uniform illumination and facial motion. Five color spaces are considered, and the most suitable color component from each is chosen to alleviate the influence of temporal illumination variation and other confounding factors. The resulting color signals are denoised using ensemble empirical mode decomposition (EEMD) and fused using principal component analysis (PCA) to derive a pulsatile component representing blood volume changes, from which HR is computed. Experiments are conducted on three standard datasets, namely PURE, UBFC, and COHFACE. The obtained mean absolute error values are 1.16 beats per minute (bpm), 1.56 bpm, and 2.10 bpm for the PURE, UBFC, and COHFACE datasets, respectively, indicating that the technique performs well within the clinically acceptable error threshold. In comparison, the technique outperformed state-of-the-art methods. These outcomes substantiate the potential of alternative color spaces for accurate and reliable HR monitoring from facial videos in challenging scenarios.
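
The abstract describes the pipeline only at a high level, so the following is a minimal sketch of its core steps under stated assumptions: per-frame mean color traces for each quality sub-ROI are assumed to be already extracted, a simple spectral concentration score stands in for the paper's fundamental periodicity coefficient, and the EEMD denoising stage is omitted for brevity. The frame rate, HR band, and function names such as `periodicity_score` and `estimate_hr_bpm` are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the HR-estimation pipeline described above (assumptions noted in-line).
# Inputs: per-frame mean color traces for each quality sub-ROI and each candidate
# color component, sampled at the video frame rate. EEMD denoising is omitted here.
import numpy as np
from scipy.signal import butter, filtfilt, periodogram
from sklearn.decomposition import PCA

FS = 30.0                 # assumed video frame rate in Hz
HR_BAND = (0.7, 4.0)      # plausible HR range: 42-240 bpm


def bandpass(x, fs=FS, band=HR_BAND, order=4):
    """Zero-phase band-pass filter restricting a trace to the HR band."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, x)


def periodicity_score(x, fs=FS, band=HR_BAND):
    """Fraction of in-band spectral power concentrated at the dominant frequency;
    used here as a simple stand-in for the fundamental periodicity coefficient."""
    f, p = periodogram(x, fs=fs)
    in_band = (f >= band[0]) & (f <= band[1])
    p_band = p[in_band]
    return p_band.max() / (p_band.sum() + 1e-12)


def weighted_subroi_signal(subroi_traces):
    """Combine quality sub-ROI traces into one signal, weighting each trace
    by its periodicity score (the spatial weighting step)."""
    filtered = np.array([bandpass(t) for t in subroi_traces])
    w = np.array([periodicity_score(t) for t in filtered])
    w = w / w.sum()
    return (w[:, None] * filtered).sum(axis=0)


def estimate_hr_bpm(color_component_signals, fs=FS):
    """Fuse the selected color-component signals with PCA and read HR from the
    spectral peak of the most periodic principal component."""
    X = np.stack([bandpass(s) for s in color_component_signals], axis=1)
    comps = PCA(n_components=X.shape[1]).fit_transform(X)
    best = max(range(comps.shape[1]), key=lambda i: periodicity_score(comps[:, i]))
    f, p = periodogram(comps[:, best], fs=fs)
    in_band = (f >= HR_BAND[0]) & (f <= HR_BAND[1])
    return 60.0 * f[in_band][np.argmax(p[in_band])]
```

In the full method, EEMD denoising would be applied to each weighted color trace before the PCA fusion step, and the choice of the most suitable component from each of the five color spaces would follow the paper's selection criteria rather than the generic periodicity score used here.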
