Abstract

Wearable-based human-computer interaction is a promising technology that enables a wide range of applications. This paper aims to track the 3D posture of the entire limb, both the wrist/ankle and the elbow/knee, of a user wearing a smart device. This limb tracking technology traces the geometric motion of the limb without the training stage usually required by gesture recognition approaches; at the same time, the tracked limb motion can serve as a generic input for gesture-based applications. The 3D posture of a limb is defined by the relative positions of its main joints, e.g., the shoulder, elbow, and wrist for an arm. When a smartwatch is worn on a user's wrist, its position is affected by both elbow and shoulder motions, so it is challenging to infer the entire 3D posture from this single point of sensor data. In this paper, we propose LimbMotion, an accurate and real-time limb tracking system. The performance gain of LimbMotion comes from several key techniques, including an accurate attitude estimator based on a novel two-step filter, fast acoustic ranging, and point-cloud-based positioning. We implemented LimbMotion and evaluated it through extensive experiments covering different gestures, movement speeds, users, and limbs. Results show that LimbMotion achieves real-time tracking with a median error of 7.5 cm to 8.9 cm, outperforming the state-of-the-art approach by about 32%.
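
The abstract names three ingredients of the pipeline: acoustic ranging, a two-step attitude filter, and positioning from the resulting ranges. The paper's actual algorithms are not given here, so the sketch below shows generic stand-ins under explicit assumptions: a fixed speed of sound, a one-axis predict/correct (complementary) filter standing in for the unspecified two-step filter, and linearized least-squares trilateration standing in for point-cloud-based positioning. All function names, parameters, and constants (`tof_to_range`, `predict_then_correct`, `trilaterate`, `alpha`) are hypothetical illustrations, not LimbMotion's implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed constant)


def tof_to_range(tof_s: float) -> float:
    """Acoustic ranging: convert a measured time-of-flight to a distance."""
    return SPEED_OF_SOUND * tof_s


def predict_then_correct(theta: float, gyro_rate: float, accel_theta: float,
                         dt: float, alpha: float = 0.98) -> float:
    """A generic two-step attitude update for one tilt angle:
    step 1 predicts by integrating the gyroscope, step 2 corrects by
    blending toward the accelerometer's gravity reference."""
    predicted = theta + gyro_rate * dt                    # step 1: predict
    return alpha * predicted + (1.0 - alpha) * accel_theta  # step 2: correct


def trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Least-squares position from anchor/range pairs. Subtracting the
    first sphere equation |x - a_i|^2 = r_i^2 from the others cancels the
    quadratic |x|^2 term and leaves a linear system in x."""
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - (ranges[1:] ** 2 - r0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos


if __name__ == "__main__":
    # Hypothetical anchor layout and a synthetic wrist position for a sanity check.
    anchors = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
    true_pos = np.array([0.3, 0.4, 0.2])
    ranges = np.linalg.norm(anchors - true_pos, axis=1)
    print(trilaterate(anchors, ranges))  # ~ [0.3, 0.4, 0.2]
```

The linearization trick in `trilaterate` is a standard choice because it turns the nonlinear sphere-intersection problem into one `lstsq` call; with noisy ranges, more anchors simply over-determine the same linear system.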
