Abstract

Sensor ego-motion or fast target movement often causes the target to temporarily leave the field of view, giving rise to the reappearing-target detection problem in target tracking applications. Because the target exits the current frame and re-enters at a later frame, the re-entry location and the variations in rotation, scale, and other three-dimensional orientations of the target are unknown, which complicates the detection and tracking of reappearing targets. A new training-based target detection algorithm has been developed using tuned basis functions (TBFs). The detection algorithm uses target and background information, extracted from training samples, to detect candidate target images. The detected candidate target images are then passed to a second algorithm, called the clutter rejection module, which determines the re-entry frame and the location of the target. This second algorithm is based on spatial-domain correlation-based template matching (TM). If the target re-enters the current frame, the target coordinates are detected and the tracking algorithm is initiated. The performance of the proposed TBF-TM-based reappearing target detection algorithm has been evaluated on real-world forward-looking infrared video sequences.
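To make the clutter rejection stage concrete, the sketch below illustrates the general idea of spatial-domain, correlation-based template matching using normalized cross-correlation (NCC). This is a minimal, illustrative example only; the function name `ncc_template_match`, the threshold value, and the use of NCC as the correlation measure are assumptions for demonstration and not taken from the paper's actual implementation.

```python
import numpy as np

def ncc_template_match(frame, template):
    """Slide `template` over `frame` and return the normalized
    cross-correlation (NCC) score map plus the best-match location.
    A strong NCC peak marks a candidate re-entry location of the target."""
    th, tw = template.shape
    fh, fw = frame.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.full((fh - th + 1, fw - tw + 1), -1.0)
    for i in range(scores.shape[0]):
        for j in range(scores.shape[1]):
            patch = frame[i:i + th, j:j + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom > 0:
                scores[i, j] = (p * t).sum() / denom
    best = np.unravel_index(scores.argmax(), scores.shape)
    return scores, best

if __name__ == "__main__":
    # Synthetic example: cut a target chip from a random frame and relocate it.
    rng = np.random.default_rng(0)
    frame = rng.random((64, 64))
    template = frame[20:30, 25:35].copy()
    scores, (row, col) = ncc_template_match(frame, template)
    # Illustrative clutter-rejection threshold (not the paper's value).
    NCC_THRESHOLD = 0.8
    if scores[row, col] >= NCC_THRESHOLD:
        print(f"candidate accepted at (row={row}, col={col}), NCC={scores[row, col]:.3f}")
    else:
        print("candidate rejected as clutter")
```

In a full pipeline of this kind, each candidate image produced by the TBF-based detection stage would be scored this way against the stored target template, and only candidates whose correlation peak exceeds the rejection threshold would be treated as a true target re-entry.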


