Abstract

Human activity recognition based on wearable sensors plays an essential role in many practical applications, such as healthcare, motion monitoring, medical examination, anomaly detection, and human-computer interaction. Notably, longer temporal sensory sequences can reflect the characteristics of different daily activities more accurately. However, existing GAN-based time-series generation methods can only synthesize uniaxial, multivariate, or multidimensional sensor data over relatively short spans of time, and such short synthetic time series cannot effectively represent even one complete daily activity cycle. To synthesize longer and more realistic multi-axial sensor data, this paper proposes a new customized GAN-based sensory data synthesis method dedicated to wearable activity recognition tasks, named Conditional SensoryGANs. First, the elaborately designed MultiScale MultiDimensional (MSMD) spatiotemporal functional module endows Conditional SensoryGANs with the capability to synthesize longer sensory sequences, which better characterize different periodic behaviors. Second, benefiting from the well-designed Time-Frequency Enhancement (TFE) functional module, Conditional SensoryGANs can more accurately capture the spatiotemporal properties of each axis and the spatial correlations between axes, improving the fidelity of the synthetic sensor data. Third, under a unified framework with fine-grained control from the embedded condition, Conditional SensoryGANs can synthesize verisimilar wearable sensor data of a specified quantity and category. Qualitative visual evaluations demonstrate that the proposed method synthesizes verisimilar wearable multi-axial sensor data better than state-of-the-art GAN-based sensor data generation methods, and quantitative experiments show that it outperforms off-the-shelf GAN-based time-series generation methods on this task. Moreover, empirical results demonstrate that synthetic sensor data from Conditional SensoryGANs achieve usability in wearable human activity recognition comparable to that of real sensor data.
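The MSMD and TFE modules are specific to this paper, but the conditional-synthesis mechanism summarized above follows the general conditional-GAN pattern: an activity label is embedded and fused with a noise vector before being decoded into a multi-axial sequence. The sketch below is a minimal, generic PyTorch illustration of that pattern only, assuming a hypothetical 3-axis sensor output of length 128 and 6 activity classes; it is not the authors' actual architecture.

```python
import torch
import torch.nn as nn

class ConditionalSequenceGenerator(nn.Module):
    """Minimal conditional generator: label embedding + noise -> multi-axial sequence.

    Illustrative sketch only; the actual Conditional SensoryGANs architecture
    (with MSMD and TFE modules) is described in the paper, not reproduced here.
    """

    def __init__(self, num_classes=6, noise_dim=100, seq_len=128, num_axes=3):
        super().__init__()
        self.seq_len = seq_len
        self.num_axes = num_axes
        # Embed the activity label so generation can be conditioned on it.
        self.label_embed = nn.Embedding(num_classes, noise_dim)
        # Decode the fused noise + label vector into a (num_axes, seq_len) sequence.
        self.decoder = nn.Sequential(
            nn.Linear(noise_dim * 2, 256),
            nn.ReLU(),
            nn.Linear(256, num_axes * seq_len),
            nn.Tanh(),  # sensor channels assumed normalized to [-1, 1]
        )

    def forward(self, noise, labels):
        # Concatenate noise with the label embedding (the "embedded condition").
        cond = torch.cat([noise, self.label_embed(labels)], dim=1)
        out = self.decoder(cond)
        return out.view(-1, self.num_axes, self.seq_len)

# Example: synthesize 4 samples of a specified activity class (hypothetical class id 2).
gen = ConditionalSequenceGenerator()
z = torch.randn(4, 100)
labels = torch.full((4,), 2, dtype=torch.long)
fake_sequences = gen(z, labels)  # shape: (4, 3, 128)
```

Conditioning on the label in this way is what allows a single trained generator to produce sensor data of a specified quantity and category, rather than training one generator per activity class.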
