Abstract
Image registration has long been used as a basis for detecting moving objects. Registration techniques attempt to discover correspondences between consecutive frame pairs based on image appearance under rigid and affine transformations. However, spatial information is often ignored, and the distinct motions of multiple moving objects cannot be efficiently modeled. Moreover, image registration is not well suited to handling occlusion, which can result in missed objects. This paper proposes a novel approach to address these problems. First, segmented frames from video sequences captured by unmanned aerial vehicles are represented as region adjacency graphs encoding visual appearance and geometric properties. Correspondence matching (for both visible and occluded regions) is then performed between graph sequences using multigraph matching. After matching, region labeling is achieved by a proposed graph coloring algorithm that assigns a background or foreground label to each region. The intuition behind the algorithm is that the background scene and foreground moving objects exhibit different motion characteristics over a sequence, so their spatial distances are expected to vary with time. Experiments conducted on several DARPA VIVID video sequences as well as self-captured videos show that the proposed method is robust to unknown transformations, with significant improvements in overall precision and recall compared to existing works.
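To make the labeling intuition above concrete, the following is a minimal Python sketch, assuming region centroids have already been placed into correspondence across frames (e.g., by a matching step such as the multigraph matching described in the abstract). The `label_regions` helper and the variance threshold are hypothetical stand-ins used only for illustration; they are not the paper's graph coloring algorithm.

```python
import numpy as np

def label_regions(centroids, var_threshold=4.0):
    """Toy background/foreground labeling from matched region tracks.

    centroids: array of shape (T, N, 2) giving the (x, y) centroid of each
    of N matched regions in each of T consecutive frames (assumed to come
    from a prior correspondence step).

    Intuition from the abstract: pairwise distances between background
    regions stay roughly constant over time, while distances involving a
    moving object vary. Thresholding the distance variance here is an
    illustrative stand-in for the paper's graph coloring decision rule.
    """
    # Pairwise distance matrix per frame: dists[t, i, j]
    diffs = centroids[:, :, None, :] - centroids[:, None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)          # (T, N, N)
    # How much each pairwise distance fluctuates over the sequence
    pair_var = dists.var(axis=0)                    # (N, N)
    # A region that moves relative to most others has a high median variance
    score = np.median(pair_var, axis=1)             # (N,)
    return ["foreground" if s > var_threshold else "background" for s in score]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, N = 20, 6
    base = rng.uniform(0, 100, size=(N, 2))
    tracks = np.repeat(base[None, :, :], T, axis=0) + rng.normal(0, 0.2, (T, N, 2))
    # Make region 0 drift steadily, as a moving object would
    tracks[:, 0, 0] += np.linspace(0, 30, T)
    print(label_regions(tracks))  # region 0 should come out as 'foreground'
```

In this toy setup the static regions keep near-constant pairwise distances, so their variance scores stay low, while the drifting region's distances to the rest change steadily and push its score above the threshold, mirroring the background/foreground separation the abstract describes.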