Abstract
Visual tracking is required by many vision applications, such as human-computer interfaces and human-robot interaction. However, in the daily living spaces where such applications are expected to be used, stable tracking is often difficult because many objects can cause visual occlusion. While conventional tracking techniques can handle partial and short-term occlusion to some extent, they fail under complete occlusion over long periods. They also cannot handle the case in which an occluder, such as a box or a bag, contains and carries the tracking target, that is, the case in which the target moves invisibly while contained by the occluder. In this paper, to handle this occlusion problem, we propose a particle-filter-based visual tracking method that switches tracking targets autonomously. In our method, when occlusion occurs during tracking, a model of the occluder is created dynamically and the tracking target is switched to this model. Our method thus enables the tracker to indirectly track the “invisible target” by following the occluder instead. Experimental results show the effectiveness of our method.
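As an illustration only, and not the authors' implementation, the following Python sketch shows one way such target switching could be wired into a simple particle filter. The color-histogram appearance model, the occlusion threshold, the random-walk motion model, and the re-detection test for the reappearing target are all assumptions made for this sketch.

```python
import numpy as np

# Hypothetical appearance model: a normalized color histogram of a patch
# centered at (cx, cy) in an H x W x 3 frame.
def color_histogram(frame, cx, cy, w, h, bins=8):
    x0, y0 = int(cx - w / 2), int(cy - h / 2)
    patch = frame[max(y0, 0):y0 + int(h), max(x0, 0):x0 + int(w)]
    hist, _ = np.histogramdd(patch.reshape(-1, 3),
                             bins=(bins, bins, bins), range=((0, 256),) * 3)
    hist = hist.ravel()
    return hist / (hist.sum() + 1e-9)

def bhattacharyya(p, q):
    # Similarity between two normalized histograms (1 = identical).
    return np.sum(np.sqrt(p * q))

class SwitchingParticleTracker:
    """Illustrative particle filter that switches its appearance model to the
    occluder when the original target becomes occluded, and switches back
    when the target is re-detected."""

    def __init__(self, init_xy, target_model, n_particles=200,
                 motion_std=5.0, occlusion_thresh=0.4):
        self.particles = np.tile(init_xy, (n_particles, 1)).astype(float)
        self.weights = np.full(n_particles, 1.0 / n_particles)
        self.model = target_model           # appearance model currently tracked
        self.original_model = target_model  # kept for re-detecting the target
        self.tracking_occluder = False
        self.motion_std = motion_std
        self.occlusion_thresh = occlusion_thresh

    def step(self, frame, patch_size=(32, 32)):
        n = len(self.particles)
        # Predict: random-walk motion model.
        self.particles += np.random.normal(0, self.motion_std, self.particles.shape)
        # Update: weight particles by similarity to the current model.
        sims = np.array([bhattacharyya(self.model,
                                       color_histogram(frame, x, y, *patch_size))
                         for x, y in self.particles])
        self.weights = sims + 1e-12
        self.weights /= self.weights.sum()
        estimate = self.weights @ self.particles
        best_sim = sims.max()

        if not self.tracking_occluder and best_sim < self.occlusion_thresh:
            # Occlusion detected: build a model of whatever now covers the
            # estimated position and switch the tracking target to it.
            self.model = color_histogram(frame, *estimate, *patch_size)
            self.tracking_occluder = True
        elif self.tracking_occluder:
            # While following the occluder, check whether the original target
            # has reappeared near the estimate and, if so, switch back.
            reappear = bhattacharyya(self.original_model,
                                     color_histogram(frame, *estimate, *patch_size))
            if reappear > self.occlusion_thresh:
                self.model = self.original_model
                self.tracking_occluder = False

        # Resample particles proportionally to their weights.
        idx = np.random.choice(n, n, p=self.weights)
        self.particles = self.particles[idx]
        self.weights = np.full(n, 1.0 / n)
        return estimate, self.tracking_occluder
```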