Abstract

Common visual features used in target tracking, such as colour and grayscale, are prone to failure against a confusingly similar-looking background. As the technology of three-dimensional visual information acquisition has gradually matured in recent years, the conditions for the wide use of depth information in target tracking have become available. This study discusses possible ways to introduce depth information into generative target tracking methods based on kernel density estimation, as well as the performance of the different methods of introduction, thereby providing a reference for the use of depth information in practical target tracking systems. First, the mean-shift technical framework, a typical algorithm for generative target tracking, is analysed, and four methods of introducing depth information are proposed: thresholding of the data source, thresholding of the density distribution of the applied dataset, weighting of the data source, and weighting of the density distribution of the dataset. An experimental study conducted to evaluate the validity, characteristics, and advantages of each method is then described. The experimental results show that all four methods improve the validity of the basic method to a certain extent and meet the requirements of real-time target tracking against a confusingly similar background. The method of weighting the density distribution of the dataset into which depth information is introduced is the prime choice in engineering practice because it delivers excellent comprehensive performance and the highest accuracy, whereas the methods of thresholding the data source and thresholding the density distribution of the dataset are less time-consuming. A comparison with a state-of-the-art tracker further verifies the practicality of the proposed approach. Finally, the research results also provide a reference for improving other target tracking methods into which depth information can be introduced.
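
As a concrete illustration of the two weighting variants named in the abstract, the sketch below scales either the grayscale data source or the back-projected density distribution by a per-pixel depth weight before mean-shift is run. The Gaussian form of the weight, and the names depth_weight, d_target, and sigma, are assumptions for illustration only and are not the paper's exact formulas.

    # Hedged sketch of depth weighting, assuming a Gaussian weight centred on
    # the target depth; the exact weighting formulas are not reproduced here.
    import numpy as np

    def depth_weight(depth, d_target, sigma):
        """Per-pixel weight that decays as the depth departs from the target depth."""
        d = depth.astype(np.float32)
        return np.exp(-((d - d_target) ** 2) / (2.0 * sigma ** 2))

    def weight_by_depth(image, depth, d_target, sigma):
        """Scale either the grayscale data source G or the back projection P
        by the depth-based weight before running mean-shift."""
        return image.astype(np.float32) * depth_weight(depth, d_target, sigma)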

Highlights

  • Video target tracking refers to the continuous tracking of the state of a target in a sequence of frames subsequent to the given initial position and scale information of the target

  • In the method described in section Mean-Shift Target Tracking Method, depth information is introduced into the thresholding of G (the grayscale data source) by the following formula once the grayscale image to be tracked becomes available: T_G(x, y) = …

  • In the method described in section Mean-Shift Target Tracking Method, depth information is introduced into the thresholding of P using the following formula once back projection P of the grayscale image to be tracked becomes available: T_P(x, y) = … (see the sketch after this list)
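
The highlights above name the two thresholding variants but the formulas are truncated in this summary. The sketch below shows one plausible reading, in which pixels whose depth deviates from the target depth by more than a tolerance are suppressed; the gate |D - d_target| <= tau and the names d_target and tau are assumptions, not the paper's exact definitions.

    # Hedged sketch of the two thresholding variants (T_G and T_P).
    import numpy as np

    def threshold_by_depth(image, depth, d_target, tau):
        """Zero out pixels whose depth deviates from the target depth by more
        than tau. `image` may be the grayscale source G (giving T_G) or the
        back projection P (giving T_P)."""
        mask = np.abs(depth.astype(np.float32) - d_target) <= tau
        return np.where(mask, image, 0).astype(image.dtype)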



INTRODUCTION

Video target tracking refers to the continuous tracking of the state of a target in a sequence of frames, given the initial position and scale of the target. The mean-shift is a typical generative method for tracking targets based on kernel density estimation. It follows a certain similarity measure criterion to calculate the degree to which every region of a data-source image matches the visual features of the tracked target. To apply the mean-shift algorithm for target tracking, one must first calculate this degree of match for every area of the image and present the result in the form of the density distribution of a two-dimensional dataset. With the two-dimensional density distribution derived from the histogram back projection, the mean-shift tracking algorithm tracks the target by iteratively locating the region most similar to it: Step 1 - Based on the position in the last frame, place the initial search window at that position in the density distribution of the two-dimensional dataset.
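
The iteration outlined above maps directly onto OpenCV's histogram back projection and mean-shift routines. The minimal sketch below, assuming a grayscale target region target_roi and a search window in (x, y, w, h) form (illustrative names, not from the paper), shows one tracking step; the depth-based variants would modify either gray_frame or back_proj before cv2.meanShift is called.

    # Minimal mean-shift tracking step on a grayscale back projection
    # (a sketch, not the paper's implementation). Requires opencv-python.
    import cv2
    import numpy as np

    def build_target_model(target_roi):
        """Grayscale histogram of the target region, normalised for back projection."""
        hist = cv2.calcHist([target_roi], [0], None, [256], [0, 256])
        cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
        return hist

    def track_step(gray_frame, target_hist, search_window):
        """Back-project the target histogram onto the frame (the 2-D density
        distribution), then shift the window toward the local density maximum."""
        back_proj = cv2.calcBackProject([gray_frame], [0], target_hist, [0, 256], 1)
        criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
        _, new_window = cv2.meanShift(back_proj, search_window, criteria)
        return new_window, back_proj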


