Abstract
In the past decade, touchless interaction with objects has drawn increasing attention in a wide range of applications, from entertainment to the real-time control of robots. For this purpose, many vision-based hand tracking devices, such as the Leap Motion and Microsoft Kinect, have been developed. However, these sensors still require improvement before they can be reliably used in delicate interaction scenarios. The major concerns are limited precision and reliability, and poor robustness to occlusions, which stem from the nature of the underlying sensors. Consequently, in this article, an adaptive multisensor fusion methodology is proposed for hand pose estimation with two Leap Motion sensors. A registration-based self-calibration algorithm is implemented to find the calibration matrix between the sensor reference frames. Two separate Kalman filters are adopted for adaptive sensor fusion of palm position and orientation. The proposed adaptive sensor fusion method is verified with experiments covering all six degrees of freedom in space. The results demonstrate that the developed adaptive methodology can perform stable and continuous hand pose estimation in real time, even when a single sensor is unable to detect the hand, and that the smoothness of the pose estimates improves significantly without being affected by occlusion of one sensor.
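The abstract describes the pipeline only at a high level. As a rough illustrative sketch of this kind of two-sensor fusion, the Python snippet below combines a Kabsch-style point-set registration (one possible realization of a registration-based calibration, not necessarily the one used in the paper) with a constant-velocity Kalman filter over palm position. All names, the confidence-weighted noise inflation used as the "adaptive" step, and the 60 Hz frame rate are assumptions for illustration; orientation would be handled analogously by a second filter, as the abstract indicates.

```python
# Illustrative sketch (not the authors' implementation): fusing 3-D palm-position
# measurements from two Leap Motion sensors. Sensor B's readings are mapped into
# sensor A's frame with a rigid transform assumed to come from a prior
# registration step (here a Kabsch-style fit of simultaneously observed palm points).
import numpy as np

def kabsch_calibration(points_a, points_b):
    """Estimate a 4x4 rigid transform mapping sensor-B points onto sensor-A points.
    points_a, points_b: (N, 3) palm positions observed simultaneously by both sensors."""
    ca, cb = points_a.mean(0), points_b.mean(0)
    H = (points_b - cb).T @ (points_a - ca)                # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                     # rotation B -> A
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, ca - R @ cb
    return T

class PalmPositionKF:
    """Constant-velocity Kalman filter over the state [x y z vx vy vz]."""
    def __init__(self, dt=1/60, q=1e-3, r=1e-2):
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.F = np.eye(6); self.F[:3, 3:] = dt * np.eye(3)  # state transition
        self.Q = q * np.eye(6)                               # process noise
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])    # measure position only
        self.R0 = r * np.eye(3)                              # nominal measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, confidence):
        """Adaptive step (assumed): inflate measurement noise as sensor confidence
        drops, so an occluded or unreliable sensor contributes less to the estimate."""
        if z is None or confidence <= 0.0:
            return                                           # sensor lost the hand: skip update
        R = self.R0 / max(confidence, 1e-3)
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P

def fuse_frame(kf, T_ba, palm_a, conf_a, palm_b, conf_b):
    """One fusion cycle: predict, then update with whichever sensors saw the hand."""
    kf.predict()
    kf.update(palm_a, conf_a)                                # sensor A is already in frame A
    if palm_b is not None:
        palm_b_in_a = (T_ba @ np.append(palm_b, 1.0))[:3]    # map B's reading into frame A
        kf.update(palm_b_in_a, conf_b)
    return kf.x[:3]                                          # fused palm position estimate
```

Because each sensor is fed into the filter as an independent measurement, the estimate degrades gracefully to single-sensor tracking when the other sensor's view of the hand is occluded, which is the behavior the abstract reports.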