Abstract

Complex activities are the activities users perform in their daily lives (e.g., having dinner, shopping). Complex activity recognition is an important problem in wearable and mobile computing. The time-series data from multimodal sensors exhibit intricate relationships that characterize complex activities (e.g., intra-sensor relationships, inter-sensor relationships, and temporal relationships), which makes traditional methods based on manually designed features ineffective. To this end, we propose HConvRNN, an end-to-end deep neural network for complex activity recognition from multimodal sensors that integrates a convolutional neural network (CNN) and a recurrent neural network (RNN). Specifically, it uses a hierarchical CNN to exploit the intra-sensor relationships among similar sensors and to merge the intra-sensor features of different sensor modalities into inter-sensor relationships, and it uses an RNN to model the temporal dynamics of the signals. Experiments on real-world datasets show that HConvRNN outperforms existing complex activity recognition methods.
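To illustrate the hierarchical CNN-plus-RNN structure described above, the following is a minimal sketch in PyTorch. It is not the authors' implementation of HConvRNN; the sensor groupings, layer sizes, kernel widths, and the class name `HierConvRNNSketch` are all illustrative assumptions, shown only to make the intra-sensor / inter-sensor / temporal decomposition concrete.

```python
# Minimal sketch: per-sensor CNNs (intra-sensor), a merging CNN (inter-sensor),
# and a GRU over time (temporal). All sizes and names are illustrative assumptions.
import torch
import torch.nn as nn


class HierConvRNNSketch(nn.Module):
    def __init__(self, channels_per_sensor=(3, 3, 3), hidden=64, num_classes=10):
        super().__init__()
        # One small CNN per sensor modality models intra-sensor relationships
        # among that sensor's channels (e.g., the 3 axes of an accelerometer).
        self.intra = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(c, 16, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.Conv1d(16, 16, kernel_size=5, padding=2),
                nn.ReLU(),
            )
            for c in channels_per_sensor
        ])
        # A shared CNN merges the per-sensor feature maps into
        # inter-sensor relationships across modalities.
        self.inter = nn.Sequential(
            nn.Conv1d(16 * len(channels_per_sensor), 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # The RNN models the temporal dynamics of the merged feature sequence.
        self.rnn = nn.GRU(64, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, xs):
        # xs: list of tensors, one per sensor, each of shape (batch, channels, time)
        feats = [net(x) for net, x in zip(self.intra, xs)]
        merged = self.inter(torch.cat(feats, dim=1))   # (batch, 64, time)
        seq = merged.transpose(1, 2)                   # (batch, time, 64)
        _, h = self.rnn(seq)                           # h: (1, batch, hidden)
        return self.classifier(h[-1])                  # (batch, num_classes)


if __name__ == "__main__":
    model = HierConvRNNSketch()
    window = [torch.randn(8, 3, 128) for _ in range(3)]  # 3 sensors, 128-sample window
    print(model(window).shape)                            # torch.Size([8, 10])
```

In this sketch the hierarchy is expressed by applying the intra-sensor convolutions before concatenating their outputs for the inter-sensor convolution; the GRU then consumes the merged features as a time series, mirroring the paper's separation of intra-sensor, inter-sensor, and temporal modeling.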
