Abstract
This paper presents a robust real-time object tracking system for human-computer interaction in mediated environments with interfering visual projection in the background. Two major contributions are made in our research to achieve robust object tracking. A reliable outlier rejection algorithm is developed using the epipolar and homography constraints to remove false candidates caused by interfering background projections and mismatches between cameras. To reliably integrate multiple estimates of the 3D object positions, an efficient fusion algorithm based on mean shift is used. This fusion algorithm can also reduce tracking errors caused by partial occlusion of the object in some of the camera views. Experimental results obtained in real-life scenarios demonstrate that the proposed system achieves reliable 3D object tracking performance in the presence of interfering background visual projection.
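The abstract describes the outlier rejection step only at a high level. As an illustrative sketch only, not the authors' implementation, the snippet below shows how a symmetric epipolar-distance test between two calibrated camera views could discard mismatched candidate pairs; the fundamental matrix `F`, the candidate lists, and the threshold `max_epi_dist` are assumed inputs.

```python
import numpy as np

def symmetric_epipolar_distance(F, x1, x2):
    """Symmetric point-to-epipolar-line distance for homogeneous 2D points x1, x2."""
    l2 = F @ x1            # epipolar line of x1 in image 2
    l1 = F.T @ x2          # epipolar line of x2 in image 1
    d2 = abs(x2 @ l2) / np.hypot(l2[0], l2[1])
    d1 = abs(x1 @ l1) / np.hypot(l1[0], l1[1])
    return d1 + d2

def consistent_pairs(F, cands1, cands2, max_epi_dist=3.0):
    """Keep only candidate pairs (i, j) consistent with the epipolar geometry.

    Thresholds and helper names are illustrative assumptions, not the paper's values.
    """
    pairs = []
    for i, p1 in enumerate(cands1):
        for j, p2 in enumerate(cands2):
            x1 = np.array([p1[0], p1[1], 1.0])
            x2 = np.array([p2[0], p2[1], 1.0])
            if symmetric_epipolar_distance(F, x1, x2) < max_epi_dist:
                pairs.append((i, j))
    return pairs
```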
Highlights
Movement-driven mediated environments attract increasing interest in many interactive applications, including mediated and interactive learning, performing arts, and rehabilitation, to name a few
We focus our discussion on robust 3D object tracking in complex environments with dynamic, interfering background visual projections
Since the goal is to locate the modes of the 3D candidates rather than to learn the full parameters of a mixture of Gaussians (MoG) with the expectation-maximization (EM) algorithm, our proposed approach adopts a fast mode-seeking procedure based on mean shift (a minimal sketch follows below)
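The fusion procedure itself is not reproduced here; the sketch below illustrates plain Gaussian-kernel mean shift mode seeking over triangulated 3D candidate positions, with the bandwidth `h` and convergence tolerance chosen for illustration rather than taken from the paper.

```python
import numpy as np

def mean_shift_modes(points, h=0.15, tol=1e-4, max_iter=100):
    """Locate modes of 3D candidate positions with Gaussian-kernel mean shift.

    points : (N, 3) array of candidate 3D positions (e.g. from triangulation).
    h      : kernel bandwidth; 0.15 m is an illustrative value, not the paper's.
    """
    modes = points.astype(float).copy()
    for _ in range(max_iter):
        shifted = np.empty_like(modes)
        for k, m in enumerate(modes):
            # Gaussian weights of all candidates relative to the current estimate.
            w = np.exp(-np.sum((points - m) ** 2, axis=1) / (2 * h * h))
            shifted[k] = (w[:, None] * points).sum(axis=0) / w.sum()
        converged = np.max(np.linalg.norm(shifted - modes, axis=1)) < tol
        modes = shifted
        if converged:
            break
    # Merge points that converged to the same mode.
    unique = []
    for m in modes:
        if not any(np.linalg.norm(m - u) < h for u in unique):
            unique.append(m)
    return np.array(unique)
```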
Summary
Movement-driven mediated environments attract increasing interest in many interactive applications, including mediated and interactive learning, performing arts, and rehabilitation, to name a few. Reliable and precise tracking of multiple objects from video in visually mediated environments is a nontrivial problem for computer vision. Visual projections used as part of the real-time feedback in a mediated environment often present a fast-changing, dynamic background. To the best of our knowledge, no existing vision-based systems have been reported to reliably track objects in 2D or 3D in environments with the kind of interfering background projections addressed in this paper. To overcome these challenges, we present a working system we have developed for real-time 3D tracking of objects in a mediated environment where interfering visual feedback is projected onto the ground and vertical planes. As noted in the future work, we are extending the proposed system with inertial sensors to recover the orientation of the objects.
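Since the interfering feedback is projected onto known planes (the ground and vertical surfaces), one plausible reading of the homography constraint mentioned in the abstract is to flag candidate pairs that are consistent with the plane-induced homography of a projection surface as background artifacts. The hypothetical sketch below illustrates that idea; the homography `H_plane`, the transfer-error metric, and the threshold are assumptions, not the authors' formulation.

```python
import numpy as np

def homography_transfer_error(H, p1, p2):
    """Reprojection error when mapping point p1 into image 2 with plane homography H."""
    x2_pred = H @ np.array([p1[0], p1[1], 1.0])
    x2_pred = x2_pred[:2] / x2_pred[2]
    return np.linalg.norm(x2_pred - np.asarray(p2[:2], dtype=float))

def is_projection_artifact(H_plane, p1, p2, max_err=2.0):
    """Flag a candidate pair as lying on the projection plane (ground or wall).

    Pairs consistent with the plane-induced homography are assumed here to be
    caused by the projected background content rather than the tracked object.
    """
    return homography_transfer_error(H_plane, p1, p2) < max_err
```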