Abstract

In this article, a novel framework is proposed that fuses posture data captured by a drone (unmanned aerial vehicle, UAV) camera with wearable-sensor data recorded by smartwatches. The framework continuously tracks persons in the drone's view by analyzing location-independent human posture features and correctly tagging video human objects with smartwatch identities (IDs) and personal profiles, thereby overcoming prior work's reliance on ground markers. Person detection, ID assignment, and pose estimation are integrated into the framework to recognize human postures, which are then matched against postures derived from the wearable sensors. By fusing common postures such as standing, walking, jumping, and falling down, the framework attains a UAV person-tracking accuracy of up to 95.36% in our testing scenarios.
