Abstract
Remote photoplethysmography (rPPG) enables noncontact heart rate (HR) estimation from facial videos and offers considerable convenience over traditional contact-based HR measurement approaches. However, its effectiveness depends heavily on the number of pixels covering the facial region; estimation accuracy therefore degrades easily when facial images are of low resolution and rPPG information is lost. Because most real application scenarios, such as group-oriented and long-distance vital sign measurement, provide only low-resolution facial videos, recovering the lost rPPG information is key to improving the performance of noncontact HR measurement. In this article, we propose a two-stage deep learning scheme for accurate HR measurement from low-resolution facial videos. In the first stage, an rPPG information recovery network recovers rPPG information during video super-resolution (SR) processing. The upscaled images are then fed into a temporally aware HR measurement network in the second stage. An attention mechanism that reassigns the weights of the temporal information in each feature channel is designed to improve measurement accuracy. The mean absolute percentage error reaches 3.320 and 4.231, and Pearson's correlation coefficient reaches 0.930 and 0.892 on the two datasets, respectively. Experimental results show that rPPG information can be effectively recovered through SR processing. Compared with state-of-the-art approaches that take high-resolution images as input, the proposed method achieves closely comparable measurement performance using only low-resolution images.
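To illustrate the kind of channel reweighting the abstract describes, below is a minimal sketch of a squeeze-and-excitation-style attention block that reassigns weights to temporal feature channels. The class name TemporalChannelAttention, the (batch, channel, time, height, width) tensor layout, and the reduction ratio are illustrative assumptions, not details taken from the paper.

# Minimal sketch (PyTorch) of a channel-attention block that reweights
# temporal feature channels, in the spirit of the mechanism described in
# the abstract. The class name, tensor layout (B, C, T, H, W), and the
# reduction ratio are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn


class TemporalChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Squeeze: global average over time and space -> one value per channel.
        self.squeeze = nn.AdaptiveAvgPool3d(1)
        # Excitation: a small bottleneck MLP that produces per-channel weights.
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time, height, width) spatiotemporal features.
        b, c, _, _, _ = x.shape
        w = self.squeeze(x).view(b, c)          # (B, C) channel descriptors
        w = self.excite(w).view(b, c, 1, 1, 1)  # per-channel weights in [0, 1]
        return x * w                            # reweighted temporal features


if __name__ == "__main__":
    feats = torch.randn(2, 64, 16, 32, 32)      # toy feature maps
    attn = TemporalChannelAttention(channels=64)
    print(attn(feats).shape)                    # torch.Size([2, 64, 16, 32, 32])

A block of this form can be dropped after any spatiotemporal convolutional stage of an HR measurement network; the learned weights then emphasize the feature channels whose temporal variation carries the pulse signal.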