Abstract

Smart sensing devices are furnished with an array of sensors, including locomotion sensors, which enable continuous and passive monitoring of human activities for ambient assisted living. As a result, sensor-based human activity recognition has gained significant popularity in recent years, and many successful research studies have been conducted in this area. However, accurately recognizing in-the-wild human activities in real time remains a fundamental challenge, as human physical activity patterns are adversely affected by their behavioral contexts. Moreover, it is essential to infer a user's behavioral context along with the physical activity to enable context-aware and knowledge-driven applications in real time. Therefore, this research work presents “C2FHAR”, a novel approach for coarse-to-fine human activity recognition in-the-wild, which explicitly models the user's behavioral contexts together with activities of daily living (ADLs) to learn and recognize fine-grained human activities. To address real-time activity recognition challenges, the proposed scheme utilizes a multi-label classification model that identifies in-the-wild human activities at two different levels, i.e., coarse or fine-grained, depending on the real-time use case. The proposed scheme is validated with extensive experiments using heterogeneous sensors, which demonstrate its efficacy.

Highlights

  • The progression of the Internet of Things (IoT) and smart sensing technologies has made ubiquitous computing an indispensable platform for assisting people in their routine life

  • The proposed C2FHAR scheme is evaluated on the ExtraSensory dataset using Random Forest (RF), Decision Tree (DT), and Neural Networks (NN) classifiers

  • The smartphone accelerometer (S-ACC) sensor achieves an average balanced accuracy (BAC) of 80.7% for coarse-level ADLs recognition with the RF classifier, which is 2.8% and 5.7% higher than the average BAC values of 77.9% and 75.0% attained with the DT and NN classifiers, respectively
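Balanced accuracy, the metric reported in the highlights above, is the mean of per-class recall; for a binary activity label this is the average of sensitivity and specificity, which keeps the score honest under the heavy label imbalance typical of in-the-wild data. A minimal sketch (the toy labels below are illustrative, not taken from the paper's ExtraSensory results):

```python
def balanced_accuracy(y_true, y_pred):
    """Balanced accuracy for a binary label: mean of sensitivity
    (recall on positives) and specificity (recall on negatives)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return 0.5 * (sensitivity + specificity)

# Imbalanced toy example: 8 negative windows, 2 positive windows.
y_true = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 0, 0, 0, 1, 0, 1, 0]
print(balanced_accuracy(y_true, y_pred))  # → 0.6875
```

Plain accuracy on this example would be 0.8 despite the classifier missing half the positives, which is why BAC is the preferred metric for imbalanced activity labels.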


Summary

INTRODUCTION

The progression of the Internet of Things (IoT) and smart sensing technologies has made ubiquitous computing an indispensable platform for assisting people in their routine life.

DATA ACQUISITION

To validate the proposed scheme, we use the publicly available ExtraSensory [15] dataset, which conforms to the pipeline of the C2FHAR model. This dataset is selected for three key reasons: 1) it is collected in-the-wild from 60 users, including 26 males and 34 females, without imposing any restriction on the users regarding the ADLs execution, 2) it contains a large number of context labels associated with the selected ADLs, which provide supplementary information about a user’s context in-the-wild, and 3) it consists of heterogeneous data from both smartphone inertial sensors (i.e., accelerometer and gyroscope) and the watch accelerometer. We used the same window size for feature extraction.
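The fixed-window feature extraction mentioned above can be sketched as follows; the window length, 50% overlap, feature set, and function name here are illustrative assumptions, not values taken from the paper:

```python
import math

def window_features(signal, window_size, step):
    """Slide a fixed-size window over a 1-D sensor stream (e.g., one
    accelerometer axis) and compute simple time-domain features per
    window. window_size and step are in samples (illustrative values)."""
    features = []
    for start in range(0, len(signal) - window_size + 1, step):
        w = signal[start:start + window_size]
        mean = sum(w) / len(w)
        var = sum((x - mean) ** 2 for x in w) / len(w)
        features.append({
            "mean": mean,
            "std": math.sqrt(var),
            "min": min(w),
            "max": max(w),
        })
    return features

# Toy stream: 10 samples, window of 4 samples, 50% overlap (step 2).
stream = [0.0, 0.1, 0.2, 0.1, 0.0, -0.1, -0.2, -0.1, 0.0, 0.1]
feats = window_features(stream, window_size=4, step=2)
print(len(feats))  # → 4
```

Using one window size across the smartphone and watch sensors, as the summary notes, keeps the per-window feature vectors aligned so they can be concatenated for the multi-label classifier.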

FEATURE EXTRACTION
ACTIVITY RECOGNITION
Procedure:
METHOD OF ANALYSIS AND CLASSIFIER TUNING
Findings
CONCLUSION