Abstract

Sensor-based human activity recognition (S-HAR) has become an important and high-impact research topic within human-centered computing. In the last decade, successful applications of S-HAR have emerged from fruitful academic research and industrial development, including healthcare monitoring, smart home control, and daily sport tracking. However, many current applications increasingly demand the recognition of complex human activities (CHA), which are considerably harder to recognize than simple human activities (SHA) and have therefore begun to attract the attention of the HAR research community. S-HAR studies have shown that deep learning (DL), a type of machine learning based on deep artificial neural networks, offers a significant degree of recognition efficiency. Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) are two families of DL methods that have been successfully applied to the S-HAR challenge in recent years. In this paper, we focus on four RNN-based DL models (LSTMs, BiLSTMs, GRUs, and BiGRUs) for complex activity recognition tasks. The efficiency of four hybrid DL models that combine convolutional layers with these RNN-based models is also studied. Experimental studies on the UTwente dataset demonstrate that the proposed hybrid RNN-based models achieve a high level of recognition performance across a variety of performance indicators, including accuracy, F1-score, and the confusion matrix. The experimental results show that the hybrid DL model CNN-BiGRU outperformed the other DL models, reaching an accuracy of 98.89% when using only complex activity data. Moreover, the CNN-BiGRU model also achieved the highest recognition performance in the other scenarios (99.44% using only simple activity data and 98.78% with a combination of simple and complex activities).
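The hybrid CNN-RNN approach described above can be sketched as follows. This is a minimal illustration in PyTorch of a CNN-BiGRU classifier, not the authors' reported configuration: the number of sensor channels, window length, class count, and layer sizes are assumptions chosen for illustration.

```python
# Minimal sketch of a hybrid CNN-BiGRU for sensor-based HAR (PyTorch).
# Assumed values: 6 sensor channels (3-axis accelerometer + gyroscope),
# 128-sample windows, 13 activity classes, 64 hidden units per direction.
import torch
import torch.nn as nn

class CNNBiGRU(nn.Module):
    """1D convolutional layers extract local motion features from raw
    sensor windows; a bidirectional GRU models temporal context in both
    directions; a linear head produces activity-class logits."""
    def __init__(self, n_channels=6, n_classes=13, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.bigru = nn.GRU(64, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):             # x: (batch, channels, time)
        z = self.conv(x)              # (batch, 64, time // 2)
        z = z.transpose(1, 2)         # (batch, time // 2, 64)
        out, _ = self.bigru(z)        # (batch, time // 2, 2 * hidden)
        return self.head(out[:, -1])  # logits from the last time step

model = CNNBiGRU()
logits = model(torch.randn(8, 6, 128))  # 8 windows of 6-axis sensor data
print(logits.shape)                     # torch.Size([8, 13])
```

In this sketch the convolution acts as a learned feature extractor over short motion patterns, while the bidirectional GRU summarizes the whole window in both temporal directions, which is the general structure the paper attributes to its hybrid models.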

Highlights

  • Human-centered computing is an emerging area of study and application that focuses on understanding human behavior and integrating users and their social contexts with digital technology

  • The recognition performance of the proposed hybrid convolutional neural network and bidirectional gated recurrent unit (CNN-BiGRU) model for complex human activity recognition is evaluated

  • We introduce a framework for sensor-based human activity recognition (S-HAR) to address the recognition of complex human activities (CHA) using wrist-worn wearable sensors

Introduction

Human-centered computing is an emerging area of study and application that focuses on understanding human behavior and integrating users and their social contexts with digital technology. Successful recognition of human activities can be extensively useful in ambient assisted living (AAL) applications [2], such as intelligent activity monitoring systems developed for elderly and disabled people in healthcare systems [3], automatic interpretation of hand gestures in sports [4], user identity verification for security systems based on gait characteristics [5], and human–robot interaction through gesture recognition [6]. The objectives of HAR systems are to (1) determine (both online and offline) the ongoing actions/activities of an individual, a group of individuals, or even a community based on sensory observation data; (2) identify certain individual characteristics, such as the identity of the people in a particular frame, gender, age, and so on; and (3) increase awareness of the context in which the observed interactions take place. The following research and development directions in HAR systems can be identified [9]: (1) HAR systems based on visual information (images and videos), (2) HAR systems based on motion inertial sensors such as accelerometers and gyroscopes.
