Abstract
Motion information is regarded as one of the most important cues for developing semantics in video data. Yet it is extremely challenging to build semantics of video clips, particularly when they involve interactive motion of multiple objects. Most existing research has focused on capturing and modelling the motion of each object individually, thus losing interaction information. Such approaches yield low precision-recall ratios and limited indexing and retrieval performance. This paper presents a novel framework for compact representation of multi-object motion trajectories. Three efficient multi-trajectory indexing and retrieval algorithms based on multilinear algebraic representations are proposed: (i) geometrical multiple-trajectory indexing and retrieval (GMIR), (ii) unfolded multiple-trajectory indexing and retrieval (UMIR), and (iii) concentrated multiple-trajectory indexing and retrieval (CMIR). The proposed tensor-based representations not only remarkably reduce the dimensionality of the indexing space but also enable the realization of fast retrieval systems. The proposed representations and algorithms can be robustly applied to both full and partial (segmented) multiple motion trajectories with varying numbers of objects, trajectory lengths, and sampling rates. The proposed algorithms have been implemented and evaluated using real video datasets. Simulation results demonstrate that the CMIR algorithm provides superior precision-recall metrics and shorter query processing time than the other approaches.
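To make the tensor-based idea concrete, the following is a minimal, illustrative sketch in Python of how a set of multi-object trajectories could be stacked into a third-order tensor, unfolded into a matrix, and reduced to a compact descriptor for distance-based retrieval. It is not the paper's GMIR, UMIR, or CMIR algorithm; the array shapes, the SVD-based reduction, and the Euclidean ranking are assumptions made purely for illustration.

# Illustrative sketch only: not the authors' GMIR/UMIR/CMIR algorithms.
# Shows one generic way multi-object trajectories could be packed into a
# third-order tensor, unfolded, and reduced to a compact index descriptor.
import numpy as np

def trajectories_to_tensor(trajectories):
    """Stack per-object trajectories (each an array of shape (T, 2) holding
    x,y samples) into a tensor of shape (num_objects, T, 2)."""
    return np.stack(trajectories, axis=0)

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def compact_descriptor(tensor, rank=4):
    """Reduce the mode-0 unfolding with a truncated SVD to a small,
    fixed-size descriptor (a stand-in for a compact index entry)."""
    unfolded = unfold(tensor, 0)                    # (num_objects, T*2)
    u, s, _ = np.linalg.svd(unfolded, full_matrices=False)
    k = min(rank, s.size)
    return (u[:, :k] * s[:k]).ravel()               # flattened low-rank factor

def retrieve(query_desc, database_descs):
    """Return database indices sorted by Euclidean distance to the query."""
    dists = [np.linalg.norm(query_desc - d) for d in database_descs]
    return np.argsort(dists)

# Toy usage: two clips, each with 3 objects sampled over 50 frames.
rng = np.random.default_rng(0)
clip_a = trajectories_to_tensor([rng.standard_normal((50, 2)) for _ in range(3)])
clip_b = trajectories_to_tensor([rng.standard_normal((50, 2)) for _ in range(3)])
db = [compact_descriptor(clip_a), compact_descriptor(clip_b)]
print(retrieve(compact_descriptor(clip_a), db))     # clip_a ranks first

In this sketch each clip contributes a fixed-size low-dimensional descriptor, so retrieval reduces to a nearest-neighbour search, which mirrors the dimensionality-reduction and fast-retrieval motivation stated in the abstract.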