Abstract
In recent years, visual tracking has been applied in a wide range of fields. Siamese trackers formulate tracking as a template-matching process, and most of them meet real-time requirements, making them well suited to UAV tracking. However, because existing trackers use only the first frame of a video sequence as a reference, they are prone to tracking drift when the target's appearance changes due to occlusion, fast motion, or the presence of similar objects, and once drift occurs it is difficult to recover. We therefore propose a motion-aware Siamese framework that helps Siamese trackers detect tracking drift over time. The base tracker first outputs the original tracking result, after which a drift detection module determines whether tracking drift has occurred; finally, the corresponding tracking recovery strategy is applied. By combining the Kalman filter's short-term prediction ability with effective recovery strategies, the framework avoids tracking drift and yields more stable and reliable results. To evaluate the proposed method, we use the Siamese region proposal network (SiamRPN), a typical anchor-based algorithm, and Siamese classification and regression (SiamCAR), a typical anchor-free algorithm, as base trackers. Experiments were carried out on three public datasets: UAV123, UAV20L, and UAVDT. The modified trackers (MaSiamRPN and MaSiamCAR) both outperformed their respective base trackers.
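To make the Kalman-filter role concrete, the sketch below shows a minimal constant-velocity Kalman filter predicting a target's center for a few frames. This is an illustrative assumption of how such short-term prediction could work, not the paper's actual implementation; the motion model, noise parameters, and the `KF1D` helper class are all hypothetical.

```python
# Hedged sketch: constant-velocity Kalman filter for short-term prediction
# of a target center. Parameters and structure are illustrative only.

class KF1D:
    """Position-velocity Kalman filter for one axis (explicit 2x2 algebra)."""

    def __init__(self, pos, dt=1.0, q=0.01, r=1.0):
        self.x, self.v = pos, 0.0            # state: position, velocity
        self.p00 = self.p11 = 10.0           # covariance, initially uncertain
        self.p01 = self.p10 = 0.0
        self.dt, self.q, self.r = dt, q, r   # step, process/measurement noise

    def predict(self):
        dt = self.dt
        self.x += dt * self.v                # x' = x + v*dt (constant velocity)
        # P <- F P F^T + Q with F = [[1, dt], [0, 1]]
        p00 = self.p00 + dt * (self.p01 + self.p10) + dt * dt * self.p11 + self.q
        p01 = self.p01 + dt * self.p11
        p10 = self.p10 + dt * self.p11
        p11 = self.p11 + self.q
        self.p00, self.p01, self.p10, self.p11 = p00, p01, p10, p11
        return self.x

    def update(self, z):
        y = z - self.x                       # innovation (position measured)
        s = self.p00 + self.r                # innovation variance
        k0, k1 = self.p00 / s, self.p10 / s  # Kalman gain
        self.x += k0 * y
        self.v += k1 * y
        p00, p01, p10, p11 = self.p00, self.p01, self.p10, self.p11
        self.p00, self.p01 = (1 - k0) * p00, (1 - k0) * p01  # P <- (I - K H) P
        self.p10, self.p11 = p10 - k1 * p00, p11 - k1 * p01

# While the tracker output is trusted, feed it to the filter; when a drift
# detector flags a frame, predict() alone can supply a fallback center.
kfx, kfy = KF1D(0.0), KF1D(0.0)
for t in range(1, 6):                        # target moving 2 px/frame in x, 1 in y
    kfx.predict(); kfy.predict()
    kfx.update(2.0 * t); kfy.update(1.0 * t)
pred = (kfx.predict(), kfy.predict())        # short-term prediction, close to (12, 6)
```

In this toy run the filter has seen five noiseless frames of constant motion, so its one-step prediction already lands near the true next position; in practice the prediction would bridge only a few frames before recovery.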