Abstract

Most existing learning-based methods for video search treat query examples as "positive" and build a model for each query. These query-dependent methods achieve only limited success because users are generally reluctant to provide enough query examples. To address this problem, we propose a novel multigraph-based query-independent learning approach to video search, which learns the relevance information contained in query-shot pairs. The proposed approach, named MG-QIL, is more general and better suited to a real-world video search system, since the learned relevance is independent of any particular query. Specifically, MG-QIL constructs multiple graphs: a main graph covering all pairs and a set of subgraphs, each covering the pairs within a single query. Pairs in the main graph are connected according to relational similarity, while pairs in each subgraph are connected according to attributional similarity. Relevance labels are then propagated through the multiple graphs until convergence. We conducted extensive experiments on automatic search tasks over the TRECVID 2005-2007 benchmarks, and the results show superior performance compared with state-of-the-art approaches to video search. Furthermore, when applied to video search reranking, MG-QIL also achieves significant and consistent improvements over a text search baseline.
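To make the propagation idea concrete, the sketch below illustrates one way label propagation over a main graph and per-query subgraphs could be implemented. It is not the paper's exact formulation: the combination weight `beta`, the symmetric normalization, and the standard iterative update `f = alpha * S f + (1 - alpha) * y` are assumptions introduced here for illustration only.

```python
import numpy as np

def normalize(W):
    """Symmetrically normalize an affinity matrix: D^{-1/2} W D^{-1/2}."""
    d = W.sum(axis=1)
    d[d == 0] = 1e-12
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ W @ D_inv_sqrt

def propagate_relevance(W_main, W_subs, y, alpha=0.9, beta=0.5,
                        max_iter=1000, tol=1e-6):
    """Illustrative multigraph label propagation (assumed formulation).

    W_main : (n, n) affinity over all query-shot pairs (relational similarity).
    W_subs : (n, n) block-diagonal affinity built from per-query subgraphs
             (attributional similarity); zero between pairs of different queries.
    y      : (n,) initial relevance labels (+1 relevant, -1 irrelevant, 0 unknown).
    beta   : assumed weight balancing the main graph against the subgraphs.
    alpha  : propagation vs. clamping trade-off, as in standard label propagation.
    """
    # Combine the two graph views into a single propagation matrix.
    S = beta * normalize(W_main) + (1.0 - beta) * normalize(W_subs)
    f = y.astype(float).copy()
    for _ in range(max_iter):
        # Spread relevance along graph edges while partially clamping to y.
        f_new = alpha * (S @ f) + (1.0 - alpha) * y
        if np.linalg.norm(f_new - f) < tol:
            break
        f = f_new
    return f  # propagated relevance scores, usable for ranking shots
```

In this reading, the main graph lets relevance information flow across different queries (the query-independent part), while the subgraphs refine scores among shots belonging to the same query; how the two similarities and their weights are actually defined is specified in the paper itself.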
