Abstract

Deep learning methods, such as keypoint-based detectors and appearance/motion-based trackers, have achieved state-of-the-art performance in object detection and tracking for natural images. However, for the small and blurry moving vehicles in satellite videos, keypoint-based detectors suffer from missed keypoint detections and incorrect keypoint matching. For multi-object tracking, it is difficult to track crowded, similar-looking vehicles stably using appearance or motion information alone. To address these problems, a novel deep learning framework is proposed for moving vehicle detection and tracking in satellite videos. It comprises a cross-frame keypoint-based detection network (CKDNet) and a spatial motion information-guided tracking network (SMTNet). In CKDNet, a customized cross-frame module is designed to assist keypoint detection by exploiting complementary information between frames. Furthermore, CKDNet improves keypoint matching by predicting the size around each keypoint and defining a soft mismatch suppression for out-of-size keypoint pairs. Building on the high-quality detections, SMTNet tracks densely packed vehicles effectively with two-branch long short-term memory networks: one branch extracts spatial information within a frame from the relative positions of neighboring vehicles, and the other extracts motion information across consecutive frames from the movement velocity. In particular, SMTNet regresses virtual positions for missed or occluded vehicles and resumes tracking them when they reappear. Experimental results on Jilin-1 and SkySat satellite videos demonstrate the effectiveness of the proposed detection and tracking methods.
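To make the two-branch idea in SMTNet concrete, the sketch below shows one plausible way a spatial branch and a motion branch could be combined to regress the next position of a track. This is a minimal illustration only, assuming PyTorch; the feature definitions (relative offsets to a fixed number of neighboring vehicles, per-frame velocity), layer sizes, and the fusion head are assumptions for exposition, not the authors' implementation.

```python
# Illustrative two-branch LSTM position regressor in the spirit of SMTNet.
# Assumptions (not from the paper): 4 neighbors, hidden size 64, a simple
# MLP fusion head that regresses the next-frame displacement (dx, dy).
import torch
import torch.nn as nn


class TwoBranchTracker(nn.Module):
    def __init__(self, num_neighbors=4, hidden=64):
        super().__init__()
        # Spatial branch: encodes relative (dx, dy) offsets to neighboring
        # vehicles in the same frame, one step per frame of track history.
        self.spatial_lstm = nn.LSTM(input_size=2 * num_neighbors,
                                    hidden_size=hidden, batch_first=True)
        # Motion branch: encodes the vehicle's per-frame velocity (vx, vy)
        # computed from consecutive detections.
        self.motion_lstm = nn.LSTM(input_size=2, hidden_size=hidden,
                                   batch_first=True)
        # Fusion head regresses the next displacement; when the detector
        # misses the vehicle, this output can serve as a "virtual" position
        # that keeps the track alive until the vehicle reappears.
        self.head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 2))

    def forward(self, neighbor_offsets, velocities):
        # neighbor_offsets: (B, T, 2*num_neighbors); velocities: (B, T, 2)
        _, (h_spatial, _) = self.spatial_lstm(neighbor_offsets)
        _, (h_motion, _) = self.motion_lstm(velocities)
        fused = torch.cat([h_spatial[-1], h_motion[-1]], dim=-1)
        return self.head(fused)  # predicted (dx, dy) for the next frame


if __name__ == "__main__":
    model = TwoBranchTracker()
    offsets = torch.randn(8, 10, 8)  # 8 tracks, 10-frame history, 4 neighbors
    vels = torch.randn(8, 10, 2)
    print(model(offsets, vels).shape)  # torch.Size([8, 2])
```

The design choice worth noting is that the two cues are kept in separate recurrent branches and only fused at the regression head, so spatial context among crowded vehicles and temporal motion dynamics are modeled independently before being combined.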
