Abstract
Video is a rich information source containing audio, visual, and embedded motion information. Applications such as e-learning, live TV, video on demand, and traffic monitoring need an efficient video retrieval strategy. Content-based video retrieval and superpixel segmentation are two diverse application areas of computer vision. In this work, we present an algorithm for content-based video retrieval based on the Integration of the Curvelet Transform and Simple Linear Iterative Clustering (ICTSLIC) algorithm. The proposed algorithm consists of two stages: offline processing and online processing. In offline processing, keyframes of the database videos are extracted using two features, the Pearson Correlation Coefficient (PCC) and color moments (CM), and the ICTSLIC superpixel generation algorithm is applied to the extracted keyframes. The resulting superpixels are used to represent the database videos. In online processing, ICTSLIC superpixel segmentation is applied to the query frame, and the generated superpixels are used to represent it. Videos similar to the query frame are then retrieved by matching, i.e., by computing the Euclidean distance between the superpixels of the query frame and those of the database keyframes. Because ICTSLIC superpixels serve as the base feature for matching and retrieval, the results of the proposed method are invariant to query-frame characteristics such as camera motion and the object's pose, orientation, and motion. The proposed method is tested on a publicly available dataset comprising different categories of video clips, such as animations, serials, personal interviews, news, movies, and songs. For evaluation, the proposed method randomly picks frames from the database videos rather than selecting keyframes as query frames.
Experiments were conducted on the developed dataset, and performance was assessed using the parameters Precision, Recall, Jaccard Index, Accuracy, and Specificity. The experimental results show that the proposed method outperforms other state-of-the-art methods.
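The retrieval pipeline described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the PCC and color-moment thresholds are hypothetical, and the ICTSLIC superpixel descriptor is replaced here by a simple grid-cell mean-color stand-in so the example stays self-contained.

```python
import numpy as np

def pearson_cc(a, b):
    """Pearson correlation coefficient between two frames (flattened)."""
    a = a.ravel().astype(float) - a.mean()
    b = b.ravel().astype(float) - b.mean()
    denom = np.sqrt((a @ a) * (b @ b))
    return (a @ b) / denom if denom else 1.0

def color_moments(frame):
    """First two color moments (mean, std) per channel."""
    f = frame.reshape(-1, frame.shape[-1]).astype(float)
    return np.concatenate([f.mean(axis=0), f.std(axis=0)])

def extract_keyframes(frames, pcc_thresh=0.9, cm_thresh=30.0):
    """Offline stage: keep a frame as a keyframe when its correlation with
    the previous keyframe drops, or its color moments shift noticeably.
    Threshold values are illustrative assumptions, not from the paper."""
    keys = [0]
    for i in range(1, len(frames)):
        prev = frames[keys[-1]]
        if (pearson_cc(prev, frames[i]) < pcc_thresh or
                np.linalg.norm(color_moments(prev) - color_moments(frames[i])) > cm_thresh):
            keys.append(i)
    return keys

def grid_descriptors(frame, grid=4):
    """Stand-in for ICTSLIC superpixels: mean color of regular grid cells."""
    h, w, c = frame.shape
    cells = []
    for i in range(grid):
        for j in range(grid):
            cell = frame[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
            cells.append(cell.reshape(-1, c).mean(axis=0))
    return np.array(cells)

def rank_videos(query_frame, video_keyframes):
    """Online stage: rank videos by the minimum Euclidean distance between
    the query descriptors and each video's keyframe descriptors."""
    q = grid_descriptors(query_frame)
    scores = []
    for vid, keyframes in video_keyframes.items():
        d = min(np.linalg.norm(q - grid_descriptors(k)) for k in keyframes)
        scores.append((d, vid))
    return [vid for _, vid in sorted(scores)]
```

In a full implementation, `grid_descriptors` would be replaced by the ICTSLIC superpixel features (curvelet-transformed SLIC segments), but the keyframe-extraction and distance-ranking structure remains the same.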