Abstract

With the advancement and ubiquity of wearable devices, wearable sensor-based human activity recognition (HAR) has become a prominent research area in healthcare and human-computer interaction. The inertial measurement unit (IMU), which provides a wide range of information such as acceleration and angular velocity, has become one of the most commonly used sensors in HAR. Recently, with the growing demand for soft and flexible wearable devices, mountable stretch sensors have emerged as a promising new modality for wearable sensor-based HAR. In this paper, we propose a deep learning-based multi-modality HAR framework consisting of three IMUs and two fabric stretch sensors in order to evaluate the potential of stretch sensors, both independently and in combination with IMU sensors, for the activity recognition task. Three deep learning algorithms, long short-term memory (LSTM), convolutional neural network (CNN), and a hybrid CNN-LSTM, are applied to the sensor data to automatically extract deep features and perform activity classification. The impact of sensor type on the recognition accuracy of different activities is also examined. A dataset collected with the proposed framework, named iSPL IMU-Stretch, and a public dataset called w-HAR are used for the experiments and performance evaluation.
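The abstract names a hybrid CNN-LSTM as one of the classifiers applied to windowed multi-sensor data. The sketch below illustrates what such a model could look like; it is not the authors' published configuration, and the window length, channel count (assumed as 3 IMUs with 6 axes each plus 2 stretch channels), number of classes, and all layer sizes are illustrative assumptions.

```python
# Illustrative sketch only: layer sizes, window length, channel count,
# and class count are assumptions, not the paper's exact configuration.
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_LEN = 128   # assumed samples per sliding window
N_CHANNELS = 20    # assumed: 3 IMUs x (3-axis accel + 3-axis gyro) + 2 stretch sensors
N_CLASSES = 7      # assumed number of activity classes

def build_cnn_lstm(window_len=WINDOW_LEN, n_channels=N_CHANNELS, n_classes=N_CLASSES):
    """Hybrid CNN-LSTM: 1-D convolutions extract local temporal features
    from each window, an LSTM models longer-range dynamics, and a softmax
    head performs activity classification."""
    inputs = layers.Input(shape=(window_len, n_channels))
    x = layers.Conv1D(64, kernel_size=5, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.Conv1D(128, kernel_size=5, activation="relu", padding="same")(x)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.LSTM(128)(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_lstm()
model.summary()
```

Under these assumptions, the same input windows could also be fed to a pure LSTM or a pure CNN by dropping the corresponding branch, which is how the three classifiers mentioned in the abstract can be compared on identical data splits.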
