Abstract
One of the major challenges in identifying human movement in indoor environments is sensitivity to uncommon indoor interactions, such as a person falling off an object or moving a chair. This work investigates human footstep movements using multiple modalities and analyzes their representations on a small self-collected dataset of acoustic and vibration-based sensors. The core idea of this study is to learn apparent similarities between two sensory traits (not limited to microphones and geophones) and to combine representations from multiple sensors. To this end, we describe a novel metric-based learning approach that introduces a multimodal framework and uses deep audio and geophone encoders in a Siamese configuration to build an adaptable, lightweight, self-supervised model for detecting human movement. The framework eliminates the need for expensive data-labeling procedures and learns general-purpose representations from limited multisensory data obtained from omnipresent sensing systems. With this expressive design, we learn temporal and spatial features extracted from the audio and geophone signals, then map the representations into a shared space to maximize a learned compatibility function between acoustic and geophone features. The learned model detects human movement from multiple sensor modalities with 99.9% accuracy. This method improves the identification of human movement in sensitive indoor environments. We also propose further investigation to demonstrate generalization and effectiveness through extensive experiments on datasets from various disciplines and in multiple settings.
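For readers unfamiliar with this style of training, the sketch below (not the authors' implementation) illustrates the general idea the abstract describes: two modality-specific encoders in a Siamese-style configuration project audio and geophone windows into a shared embedding space, and a contrastive compatibility objective pulls time-aligned pairs together. The use of PyTorch, 1-D convolutional encoders, separate (non-weight-shared) branches, and an InfoNCE-style loss are all illustrative assumptions, as are the layer sizes and input lengths.

# Minimal dual-encoder sketch under the assumptions stated above.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """1-D convolutional encoder mapping a raw waveform to a normalized embedding."""

    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=9, stride=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=9, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # temporal pooling to a fixed length
        )
        self.proj = nn.Linear(64, embed_dim)    # projection into the shared space

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.conv(x).squeeze(-1)            # (batch, 64)
        return F.normalize(self.proj(h), dim=-1)


def compatibility_loss(z_audio: torch.Tensor, z_geo: torch.Tensor,
                       temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style objective: matched audio/geophone pairs are pulled together,
    mismatched pairs within the batch are pushed apart."""
    logits = z_audio @ z_geo.t() / temperature   # (batch, batch) similarity matrix
    targets = torch.arange(z_audio.size(0))      # i-th audio matches i-th geophone window
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    audio_enc, geo_enc = Encoder(), Encoder()    # Siamese-style paired encoders, one per modality
    audio = torch.randn(8, 1, 16000)             # hypothetical batch of audio windows
    geophone = torch.randn(8, 1, 16000)          # time-aligned geophone windows
    loss = compatibility_loss(audio_enc(audio), geo_enc(geophone))
    loss.backward()
    print(f"contrastive compatibility loss: {loss.item():.4f}")

Because the objective only needs time-aligned audio/geophone pairs rather than class labels, it matches the self-supervised, label-free training regime the abstract emphasizes; a small classifier could then be trained on the frozen embeddings for movement detection.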