Abstract

Video tracking is a fundamental problem in computer vision with many applications. The goal of video tracking is to isolate a target object from its background across a sequence of frames. Tracking is inherently a three-dimensional problem in that it incorporates the time dimension, so the computational efficiency of video segmentation is a major challenge. In this paper we present a generic and robust graph-theory-based tracking scheme for videos. Unlike previous graph-based tracking methods, the suggested approach treats motion as a pixel property (like color or position) rather than as a consistency constraint (i.e., restricting the object's location in the current frame to the neighborhood of its location in the previous frame shifted by the estimated motion), and it solves the tracking problem optimally, applying neither heuristics nor approximations. The scheme is robust enough to incorporate the computationally cheap MPEG-4 motion estimation schemes. Although block-matching techniques generate noisy, coarse motion fields, they enable faster computation because a broad variety of off-the-shelf software and hardware components specialize in this task. Evaluation on standard and non-standard benchmark videos shows that the suggested algorithm supports fast and accurate video tracking, making it amenable to real-time applications.
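The abstract does not include code, but the MPEG-4-style motion estimation it refers to is classically an exhaustive block-matching search minimizing the sum of absolute differences (SAD). The following is a minimal illustrative sketch of that idea, not the paper's implementation; the function name, block size, and search radius are assumptions. The resulting per-block motion vectors are the kind of "motion as a pixel property" feature the abstract describes attaching to pixels alongside color and position.

```python
import numpy as np

def block_matching(prev, curr, block=8, search=4):
    """Illustrative exhaustive block-matching motion estimation (not the paper's code).

    For each block of `curr`, find the displacement (dy, dx) within
    +/- `search` pixels that minimizes the sum of absolute differences
    (SAD) against `prev`. Returns motion vectors of shape
    (H // block, W // block, 2).
    """
    H, W = curr.shape
    mv = np.zeros((H // block, W // block, 2), dtype=int)
    for by in range(H // block):
        for bx in range(W // block):
            y, x = by * block, bx * block
            cur_blk = curr[y:y + block, x:x + block].astype(int)
            best_sad, best_v = None, (0, 0)
            # Exhaustive search over the (2*search + 1)^2 candidate displacements.
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    py, px = y + dy, x + dx
                    if py < 0 or px < 0 or py + block > H or px + block > W:
                        continue  # candidate block falls outside the frame
                    ref = prev[py:py + block, px:px + block].astype(int)
                    sad = np.abs(cur_blk - ref).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_v = sad, (dy, dx)
            mv[by, bx] = best_v
    return mv
```

This brute-force search is what dedicated codec hardware accelerates; the coarse, per-block vectors it produces are exactly the "noisy and coarse motion fields" the abstract says the graph-based formulation is robust enough to tolerate.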
