Abstract

Inertial signals are the most widely used inputs in human activity recognition (HAR) applications, and extensive research has been devoted to developing HAR classifiers from accelerometer and gyroscope data. This study investigated whether fusing biological signals with inertial signals can enhance HAR models. The classification of eight common low-, medium-, and high-intensity activities was assessed using machine learning (ML) algorithms trained on accelerometer (ACC), blood volume pulse (BVP), and electrodermal activity (EDA) data obtained from a wrist-worn sensor. Two types of ML algorithms were employed: a random forest (RF) trained on features, and a pre-trained deep learning (DL) network (ResNet-18) trained on spectrogram images. Evaluation was conducted both on individual activities and on more general activity groups formed by similar intensity. The results indicated that RF classifiers outperformed the corresponding DL classifiers at both the individual and the grouped level. However, fusing EDA and BVP signals with ACC data improved DL classifier performance relative to a baseline DL model trained on ACC data alone. The best performance was achieved by a classifier trained on a combination of ACC, EDA, and BVP images, yielding F1-scores of 69 and 87 for individual and grouped activity classification, respectively. For DL models trained with the additional biological signals, almost all individual activity classifications improved (p < 0.05). In the grouped activity classifications, DL model performance was enhanced for low- and medium-intensity activities. For two specific activities, ascending/descending stairs and cycling, classification improved significantly with a DL model trained on combined ACC, BVP, and EDA spectrogram images (p < 0.05).
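The spectrogram-based fusion can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: the helper names (signal_to_spectrogram, fuse_acc_bvp_eda), the default sampling rates, the window handling, and the torchvision ResNet18_Weights API are hypothetical choices made for the example; only the general idea of stacking ACC, BVP, and EDA spectrograms into a three-channel image and fine-tuning a pre-trained ResNet-18 for eight activity classes comes from the abstract.

# Minimal sketch, assuming PyTorch, torchvision >= 0.13, and SciPy are available
# and that each window of wrist-sensor data has already been segmented.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram
from torchvision.models import resnet18, ResNet18_Weights

def signal_to_spectrogram(x, fs, out_size=224):
    """Turn a 1-D signal window into a log-magnitude spectrogram image."""
    _, _, sxx = spectrogram(np.asarray(x, dtype=float), fs=fs,
                            nperseg=min(64, len(x)))
    t = torch.tensor(np.log1p(sxx), dtype=torch.float32)[None, None]
    # Resize every modality to the same square resolution expected by ResNet-18.
    t = nn.functional.interpolate(t, size=(out_size, out_size),
                                  mode="bilinear", align_corners=False)
    return t[0, 0]

def fuse_acc_bvp_eda(acc_mag, bvp, eda, fs_acc=32, fs_bvp=64, fs_eda=4):
    """Stack ACC, BVP, and EDA spectrograms as the three channels of one image.

    The sampling-rate defaults are hypothetical placeholders for a wrist-worn
    sensor and should be replaced with the actual device rates.
    """
    img = torch.stack([
        signal_to_spectrogram(acc_mag, fs_acc),
        signal_to_spectrogram(bvp, fs_bvp),
        signal_to_spectrogram(eda, fs_eda),
    ])
    # Per-channel min-max scaling so the modalities share a comparable range.
    lo = img.amin(dim=(1, 2), keepdim=True)
    hi = img.amax(dim=(1, 2), keepdim=True)
    return (img - lo) / (hi - lo + 1e-8)

# Pre-trained ResNet-18 with its classification head replaced for the
# eight activity classes described in the abstract.
model = resnet18(weights=ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 8)

A fused three-channel image produced this way could then be passed to the network during standard supervised fine-tuning against the eight activity labels.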
