Abstract

Wearable sensors have great potential to ensure the safety and security of humans in hazardous and unknown environments, where real-time tracking and mapping are key capabilities. Integrating body-mounted wearable sensor measurements for human tracking is challenging due to the high degrees of freedom of human motion; in particular, time synchronization and spatial alignment between different sensors complicate the integration of measurements, while weight, power, and computational resources impose additional constraints. This article addresses these challenges by presenting the design and implementation of a multisensor pedestrian tracking system that integrates, spatially aligns, and time-synchronizes state-of-the-art wearable tracking and positioning sensors and technologies. The integrated measurements are packaged into labelled datasets that will be publicly available and easily accessible to researchers. The dataset includes raw and processed data from multiple wearable inertial sensors, a stereo camera, a handheld LiDAR, and an ultra-wideband (UWB) receiver, and is supported by a high-accuracy ground-truth reference generated by a post-processed 3D LiDAR SLAM engine. The article also assesses state-of-the-art tracking technologies and sensors, and details the tracking algorithms, their performance, and their challenges.
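The time-synchronization problem the abstract mentions can be illustrated with a minimal sketch: resampling a high-rate sensor stream onto the timestamps of a lower-rate one by linear interpolation. This is a generic illustration, not the article's actual method; the 100 Hz IMU rate, 10 Hz camera rate, and function name are all assumptions for the example.

```python
import numpy as np

def align_to_frames(imu_t, imu_val, frame_t):
    """Linearly interpolate IMU samples onto camera frame timestamps.

    Assumes both streams share a common clock; in practice a per-sensor
    clock offset must be estimated first.
    """
    return np.interp(frame_t, imu_t, imu_val)

# Synthetic data: a 100 Hz IMU channel and 10 Hz camera frame timestamps.
imu_t = np.arange(0.0, 1.0, 0.01)        # IMU timestamps (s)
imu_val = np.sin(2 * np.pi * imu_t)      # synthetic gyro reading
frame_t = np.arange(0.0, 1.0, 0.1)       # camera frame timestamps (s)

aligned = align_to_frames(imu_t, imu_val, frame_t)
print(aligned.shape)  # one interpolated IMU sample per camera frame
```

Interpolation to a common time base is only the simplest building block; multisensor systems like the one described also need clock-offset estimation and extrinsic (spatial) calibration between sensor frames.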
