Abstract

Human activity recognition (HAR) plays an important role in a wide range of applications, such as health monitoring and gaming. Inertial sensors attached to body segments constitute a critical sensing system for HAR. Diverse inertial sensor datasets for HAR have been released with the intention of attracting collective efforts and reducing the data collection burden. However, these datasets are heterogeneous in terms of subjects and sensor positions. The coupling of these two factors makes it hard to generalize a model to a new application scenario with unseen subjects and new sensor position combinations. In this paper, we design a framework that combines heterogeneous data to learn a general representation for HAR, so that it can be applied to new scenarios. We propose an Augmented Adversarial Learning framework for HAR (AALH) that learns generalizable representations to cope with diverse combinations of sensor positions and subject discrepancies. We train an adversarial neural network to map data from various sensor sets into a common latent representation space that is domain-invariant and class-discriminative. We enrich the latent representation space with a hybrid missing strategy and complement each subject domain via a multi-domain mixup method, both of which significantly improve model generalization. Experimental results on two HAR datasets demonstrate that the proposed method significantly outperforms previous methods on unseen subjects and new sensor position combinations.
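As a rough illustration of the mixup idea mentioned in the abstract, the sketch below interpolates a batch of latent features and labels from one subject domain with a batch from another. This is a minimal sketch of generic cross-domain mixup, not the authors' exact multi-domain procedure; the function name `mix_domains`, the Beta(alpha, alpha) parameterization, and the tensor shapes are illustrative assumptions.

```python
# Illustrative sketch only: mixing samples from two subject domains,
# in the spirit of the multi-domain mixup described in the abstract.
import numpy as np
import torch

def mix_domains(x_a, y_a, x_b, y_b, alpha=0.2):
    """Convexly combine a batch from subject domain A with one from domain B.

    x_*: feature tensors of shape (batch, dim); y_*: one-hot label tensors.
    Returns interpolated features and soft labels.
    """
    lam = np.random.beta(alpha, alpha)      # mixing coefficient in [0, 1]
    x_mix = lam * x_a + (1.0 - lam) * x_b   # interpolate latent features
    y_mix = lam * y_a + (1.0 - lam) * y_b   # interpolate (soft) labels
    return x_mix, y_mix

# Example usage with two synthetic batches of latent representations
# (hypothetical sizes: 32 samples, 128-d features, 6 activity classes).
x_a, y_a = torch.randn(32, 128), torch.eye(6)[torch.randint(0, 6, (32,))]
x_b, y_b = torch.randn(32, 128), torch.eye(6)[torch.randint(0, 6, (32,))]
x_mix, y_mix = mix_domains(x_a, y_a, x_b, y_b)
```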
