Abstract

Photoplethysmography (PPG) signals, typically acquired from wearable devices, hold significant potential for continuous fitness and health monitoring. Of special interest are heart conditions that manifest as rare, subtle deviations in heart patterns. However, robust and reliable anomaly detection in these data remains challenging due to the scarcity of labeled data and high inter-subject variability. This paper introduces a two-stage framework that leverages representation learning and personalization to improve anomaly detection performance on PPG data. The proposed framework first employs representation learning to transform the original PPG signals into a more discriminative and compact representation. We then apply three different unsupervised anomaly detection methods for movement detection and biometric identification. We validate our approach on two different datasets in both generalized and personalized scenarios. Our results demonstrate significant improvements: for movement detection in the generalized scenario, representation learning raised AUCs from barely 0.5 to above 0.9. Importantly, inter-subject variability was substantially reduced, from around 0.4 to below 0.1. In the personalized scenario, AUCs approached 1.0, with variability further reduced to below 0.05, indicating the effectiveness of both representation learning and personalization for anomaly detection in PPG data. Similar improvements were observed for biometric identification, underscoring how our approach can minimize inter-subject variability and enhance PPG-based health monitoring systems.
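The two-stage pipeline described above can be sketched as follows. This is a minimal illustration only: the abstract does not specify the encoder or the three detectors, so PCA stands in for the learned representation and Isolation Forest for one unsupervised detector, operating on synthetic windows standing in for PPG segments.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic stand-in for windowed PPG: 200 normal windows of 128 samples,
# each a pulse-like waveform plus sensor noise.
normal = np.sin(np.linspace(0, 8 * np.pi, 128)) + rng.normal(0.0, 1.0, (200, 128))
# 20 anomalous windows, e.g. movement artifacts with much larger variance.
anomalous = rng.normal(0.0, 3.0, (20, 128))
X = np.vstack([normal, anomalous])

# Stage 1 (representation learning stand-in): compress each window into a
# compact, more discriminative embedding. PCA is a placeholder for the
# paper's actual representation-learning model.
emb = PCA(n_components=8, random_state=0).fit_transform(X)

# Stage 2: unsupervised anomaly detection on the learned representation,
# fit only on the normal windows (labels are scarce, so no anomaly labels
# are used for training).
det = IsolationForest(random_state=0).fit(emb[:200])
scores = -det.score_samples(emb)  # higher score = more anomalous
```

Under these assumptions, the anomalous windows should receive higher scores on average than the normal ones; personalization would correspond to fitting the stage-2 detector per subject rather than on pooled data.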
