Abstract

This paper presents a wearable device, worn on the waist, that recognizes six activities of daily living (walking, walking upstairs, walking downstairs, sitting, standing, and lying) through a deep-learning human activity recognition (HAR) algorithm. The wearable device comprises a single-board computer (SBC) and six-axis sensors. The deep-learning algorithm employs three parallel convolutional neural networks with different kernel sizes for local feature extraction; their outputs are concatenated to form a feature fusion model. Using kernels of different sizes allows relevant local features of varying lengths to be identified, thereby increasing the accuracy of human activity recognition. For experimental data, the University of California, Irvine (UCI) HAR dataset and self-recorded data were used separately. The self-recorded data were obtained by having 21 participants wear the device on their waist and perform six common activities in the laboratory. These data were used to verify the performance of the proposed deep-learning algorithm on the wearable device. The accuracies for these six activities on the UCI dataset and on the self-recorded data were 97.49% and 96.27%, respectively, and the accuracies under tenfold cross-validation were 99.56% and 97.46%, respectively. The experimental results verify the proposed convolutional neural network (CNN) architecture, which can be used in rehabilitation assessment for people unable to exercise vigorously.
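The abstract describes the architecture only at a high level. The following is a minimal sketch, assuming a Keras/TensorFlow implementation, of how three parallel convolutional branches with different kernel sizes can be concatenated into a single feature-fusion classifier. The kernel sizes, filter counts, window length, and channel count below are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a multi-scale parallel CNN for HAR.
# Kernel sizes (3, 7, 11), filter counts, window length (128 samples),
# and channel count (6-axis accelerometer + gyroscope) are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_multiscale_cnn(window_len=128, n_channels=6, n_classes=6):
    inputs = layers.Input(shape=(window_len, n_channels))

    # Three parallel convolutional branches with different kernel sizes
    # capture local features of different temporal lengths.
    branches = []
    for k in (3, 7, 11):
        x = layers.Conv1D(64, kernel_size=k, padding="same", activation="relu")(inputs)
        x = layers.MaxPooling1D(pool_size=2)(x)
        x = layers.Conv1D(64, kernel_size=k, padding="same", activation="relu")(x)
        x = layers.GlobalAveragePooling1D()(x)
        branches.append(x)

    # Feature fusion: concatenate the branch outputs before classification.
    fused = layers.Concatenate()(branches)
    fused = layers.Dense(128, activation="relu")(fused)
    outputs = layers.Dense(n_classes, activation="softmax")(fused)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Pooling each branch to a fixed-size vector before concatenation keeps the fused feature dimension independent of window length, which is one common way to combine branches of different receptive-field sizes.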

Highlights

  • With the popularization of wearable devices and reductions in their size and cost in recent years, sensors have been applied in human activity recognition (HAR)

  • The data are preprocessed through denoising and normalization, among other procedures, for subsequent feature extraction and for training HAR classifiers [13]

  • We placed the sensor on the waist for data collection; the data were normalized and then used for feature extraction and for training classifiers for activity recognition


Summary

Introduction

With the popularization of wearable devices and reductions in their size and cost in recent years, sensors have been applied to human activity recognition (HAR). We placed the sensor on the waist for data collection; the data were normalized and then used for feature extraction and for training classifiers for activity recognition, as sketched below. The main contribution of this paper is multi-scale feature extraction through parallel convolutional neural networks with different kernel sizes, which improves the accuracy of human activity recognition. A belt was used to fasten the wearable device on the participants' waists to prevent it from shaking with their movements; this increased stability facilitated data collection by the accelerometer and gyroscope. The number of data entries corresponding to each activity of daily living in the training, testing, and verification sets was 1293, 693, and 324, respectively.
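As a sketch of the preprocessing step mentioned above, the following shows one common way to window and normalize raw six-axis samples before feeding them to a classifier. The window length, overlap, sampling rate, and per-channel z-score normalization are assumptions for illustration, not details stated in this summary.

```python
# Illustrative preprocessing sketch: sliding windows + per-channel
# z-score normalization of raw six-axis (accelerometer + gyroscope) data.
# Window length (128) and 50% overlap are assumed values.
import numpy as np

def make_windows(signal, window_len=128, step=64):
    """Slice a (n_samples, 6) signal into overlapping windows."""
    windows = [signal[s:s + window_len]
               for s in range(0, len(signal) - window_len + 1, step)]
    return np.stack(windows)               # shape: (n_windows, window_len, 6)

def normalize(windows, eps=1e-8):
    """Z-score each channel within each window."""
    mean = windows.mean(axis=1, keepdims=True)
    std = windows.std(axis=1, keepdims=True)
    return (windows - mean) / (std + eps)

# Example: 10 s of data at 50 Hz from a waist-worn sensor (random stand-in).
raw = np.random.randn(500, 6)
x = normalize(make_windows(raw))
print(x.shape)                              # (6, 128, 6) with these settings
```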

HAR Algorithm
Network Framework
Assessment Indexes of the UCI Dataset
Confusion Matrix of the UCI Dataset
Various Assessment Indexes of Self-Recorded Data
Confusion Matrix of the Self-Recorded Data
Model Accuracy and Loss Function of the UCI Dataset
Findings
Conclusions
