Abstract

Identifying correspondences between trajectory segments observed from nonsynchronized cameras is important for reconstructing the complete trajectory of moving targets in a large scene. Such a reconstruction can be obtained from motion data by comparing trajectory segments and estimating both their spatial and temporal alignment. Exhaustively testing all possible trajectory correspondences over a temporal window is viable only when there are few moving targets and large view overlaps. Alternative solutions are therefore required when many trajectories are only partially visible in each view. In this paper, we propose a new method based on a view-invariant representation of trajectories, from which a sparse set of salient points is extracted for the trajectory segments observed in each view. Only the neighborhoods of these salient points in the view-invariant representation are then used to estimate the spatial and temporal alignment of trajectory pairs in different views. It is demonstrated that, for planar scenes, the method recovers both the spatial and temporal alignment with good precision and efficiency, even given relatively small overlap between views and arbitrary (unknown) temporal shifts between the cameras. The method retains these capabilities for trajectories that are only locally planar but exhibit some nonplanarity at the global level.
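
The sketch below illustrates one possible reading of this pipeline. It assumes a curvature-based signature as the view-invariant representation, correlation of the signatures over candidate temporal offsets, and a DLT homography fitted on matched salient points; the choice of invariant, the matching rule, and all function names are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def signature(traj):
    """Curvature magnitude along a 2-D trajectory of shape (N, 2) -> (N,).

    Curvature is invariant to rotation and translation of the image plane;
    the paper's actual invariant may be a different (e.g. projective) one.
    """
    d1 = np.gradient(traj, axis=0)                        # velocity
    d2 = np.gradient(d1, axis=0)                          # acceleration
    cross = d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]
    speed = np.linalg.norm(d1, axis=1)
    return np.abs(cross) / np.maximum(speed ** 3, 1e-9)

def salient_points(sig, k=8):
    """Indices of the k strongest local maxima of the signature."""
    i = np.arange(1, len(sig) - 1)
    peaks = i[(sig[i] > sig[i - 1]) & (sig[i] > sig[i + 1])]
    return peaks[np.argsort(sig[peaks])[::-1][:k]]

def fit_homography(src, dst):
    """Direct linear transform from >= 4 point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def align(traj_a, traj_b, max_shift=50):
    """Estimate (temporal shift, homography) between two trajectory segments.

    The shift is found by correlating the two signatures over candidate
    offsets; the homography is then fitted on salient points of trajectory A
    paired with the samples of trajectory B at the shifted time indices
    (a hypothetical matching rule, used here for illustration only).
    """
    sig_a, sig_b = signature(traj_a), signature(traj_b)
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(len(sig_a), len(sig_b) + s)
        if hi - lo < 10:                                   # too little overlap
            continue
        score = np.corrcoef(sig_a[lo:hi], sig_b[lo - s:hi - s])[0, 1]
        if score > best_score:
            best_shift, best_score = s, score
    idx = salient_points(sig_a)
    idx = idx[(idx - best_shift >= 0) & (idx - best_shift < len(traj_b))]
    H = fit_homography(traj_b[idx - best_shift], traj_a[idx])
    return best_shift, H
```

Restricting the spatial fit to neighborhoods of salient points keeps the matching sparse, which is what makes this kind of approach tractable when many trajectory segments with small view overlap must be compared.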
