Abstract

Long Short-Term Memory (LSTM)-based structures have demonstrated their efficiency for the recognition of daily living activities in smart homes by capturing the order of sensor activations and their temporal dependencies. Nevertheless, they still struggle to deal with the semantics and the context of the sensors. Beyond isolated identifiers and their ordered activation values, sensors also carry meaning. Indeed, their nature and type of activation can reflect various activities. Their logs are correlated with each other, creating a global context. We propose to use and compare two Natural Language Processing embedding methods to enhance LSTM-based structures in activity-sequence classification tasks: Word2Vec, a static semantic embedding, and ELMo, a contextualized embedding. Results on real smart home datasets indicate that this approach provides useful information, such as a sensor organization map, and reduces confusion between daily activity classes. It also improves performance on datasets with competing activities from other residents or pets. Our tests also show that the embeddings can be pretrained on datasets other than the target one, enabling transfer learning. We thus demonstrate that taking the context of the sensors and their semantics into account increases classification performance and enables transfer learning.
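To make the proposed pipeline concrete, the following is a minimal sketch in PyTorch of an LSTM classifier over embedded sensor sequences. The class name, layer sizes, and initialization scheme are illustrative assumptions rather than the authors' exact architecture; the embedding layer would be initialized from vectors pretrained with Word2Vec (or replaced by contextualized ELMo representations).

    import torch
    import torch.nn as nn

    class SensorLSTMClassifier(nn.Module):
        def __init__(self, num_sensors, embed_dim, hidden_dim, num_classes,
                     pretrained_embeddings=None):
            super().__init__()
            self.embedding = nn.Embedding(num_sensors, embed_dim)
            if pretrained_embeddings is not None:
                # Assumption: a (num_sensors x embed_dim) tensor of vectors
                # pretrained on sensor logs (e.g., with Word2Vec), possibly
                # on a different smart home dataset (transfer learning).
                self.embedding.weight.data.copy_(pretrained_embeddings)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.classifier = nn.Linear(hidden_dim, num_classes)

        def forward(self, sensor_ids):
            # sensor_ids: (batch, seq_len) integer-encoded sensor activations
            embedded = self.embedding(sensor_ids)   # (batch, seq_len, embed_dim)
            _, (h_n, _) = self.lstm(embedded)       # final hidden state
            return self.classifier(h_n[-1])         # logits over activity classes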

Highlights

  • Recent advances in Internet of Things (IoT) hardware, in terms of energy consumption, cost, and interoperability [1], have boosted the development of smart environments, such as smart homes

  • We compare the embeddings obtained by Word2Vec and ELMo, and we show that transfer learning is possible with this approach

  • To find a suitable embedding for smart home data that can tackle the challenges described above, we review the methods deployed in ADL recognition based on pattern recognition and spatio-temporal sequence analysis, as well as the methods for extracting semantic features from the Natural Language Processing (NLP) domain


Summary

Introduction

Recent advances in Internet of Things (IoT) hardware, in terms of energy consumption, cost, and interoperability [1], have boosted the development of smart environments, such as smart homes. Smart homes can provide many other useful services, such as energy management or security systems. In order to offer both automated and customized services, a smart home must be able to understand the daily activities of its residents. Our goal is to divert the use of NLP embedding methods in order to extract contextual and semantic information from smart home sensors.

Word2Vec. Word2Vec is one of the most popular techniques to learn word embeddings using shallow neural networks. It was developed by Mikolov et al. [7] as an unsupervised learning technique to learn continuous representations of words. There exist two main methods for learning representations of words in the Word2Vec algorithm (Figure 1): (1) the Continuous Bag Of Words (CBOW), trained by learning to predict the center word based on the context words, as in Figure 1a; (2) the Skip-gram model, trained by learning to predict the context words based on the center word, as in Figure 1b.
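As an illustration of how such an embedding could be trained on smart home data, the sketch below uses gensim's Word2Vec, treating each sensor event as a "word" and each sequence of activations as a "sentence". The sensor names, vector size, and window are illustrative assumptions, not the settings reported in the paper.

    from gensim.models import Word2Vec

    # Each inner list is one sequence of sensor activations (hypothetical IDs).
    sensor_sequences = [
        ["M012_ON", "M013_ON", "D002_OPEN", "M013_OFF"],
        ["M021_ON", "T003_HIGH", "M021_OFF"],
    ]

    # sg=0 -> CBOW (predict the center event from its context);
    # sg=1 -> Skip-gram (predict the context events from the center event).
    model = Word2Vec(sensor_sequences, vector_size=64, window=5,
                     min_count=1, sg=0)

    vector = model.wv["M012_ON"]                 # 64-dimensional sensor embedding
    similar = model.wv.most_similar("M012_ON")   # sensors used in similar contexts

Vectors learned this way place sensors that occur in similar contexts close to one another, which is what yields the sensor organization map mentioned in the abstract.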

