Abstract
Human activity recognition (HAR) is becoming increasingly significant in a range of applications, including healthcare and rehabilitation monitoring. With the rapid development of information and communication technologies, wearable devices have inspired new forms of human-computer interaction. Wearable inertial sensors are widely used in HAR because they deliver the most informative motion signals. Researchers in HAR continually investigate alternative methodologies and signal sources to improve recognition performance. This study examines the effect of combining biosignals with wearable-sensor data from a publicly available dataset on detecting everyday human activities. We used the MHEALTH dataset, which contains electrocardiogram (ECG), accelerometer, gyroscope, and magnetometer data collected from ten individuals performing twelve everyday activities. We propose a deep learning technique that automatically extracts features and builds a recognition model under several configurations, including fused sensor data. Our findings indicate that combining the ECG and IMU signals increases the classifier's F1-score by 11.53 percentage points, from 86.83% to 98.36%.
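The sensor-fusion idea described above can be sketched as a simple preprocessing step: MHEALTH provides 23 channels per sample (2-lead ECG plus accelerometer, gyroscope, and magnetometer axes) at 50 Hz, which can be concatenated and segmented into fixed-length windows for a classifier. The window length and overlap below are illustrative assumptions, not the paper's exact configuration, and the random arrays stand in for real recordings.

```python
import numpy as np

FS = 50            # MHEALTH sampling rate in Hz
WINDOW = 2 * FS    # assumed 2-second analysis window (100 samples)
STEP = FS          # assumed 50% overlap between consecutive windows

def sliding_windows(signal, window=WINDOW, step=STEP):
    """Slice a (timesteps, channels) recording into overlapping windows."""
    starts = range(0, signal.shape[0] - window + 1, step)
    return np.stack([signal[s:s + window] for s in starts])

def fuse(ecg, imu):
    """Concatenate ECG and IMU streams along the channel axis."""
    return np.concatenate([ecg, imu], axis=1)

# Toy stand-in for one subject's recording:
# 10 s of 2-lead ECG plus 21 IMU channels (3 accelerometers, 2 gyroscopes,
# 2 magnetometers across the chest, ankle, and wrist units).
ecg = np.random.randn(10 * FS, 2)
imu = np.random.randn(10 * FS, 21)

windows = sliding_windows(fuse(ecg, imu))
print(windows.shape)   # (9, 100, 23): 9 windows of 100 samples x 23 channels
```

Each resulting window is a 100 x 23 matrix that a 1D convolutional or recurrent model can consume directly, which is the kind of automatic feature extraction the abstract refers to.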