Abstract

The exponential growth of video data over the last several decades has produced a steady stream of near-duplicate recordings. The data-quality problems caused by near-duplicate videos are becoming increasingly noticeable and affect the use of ordinary video collections. Although near-duplicate video detection has made progress, there is still no automated merging method for video data characterised by high-dimensional features, so near-duplicate videos cannot be cleaned automatically in advance to improve the quality of a video dataset. Research on removing near-duplicate video data is still in its early stages. In current work, two delicate issues severely compromise the precision of near-duplicate video cleaning when the prior distribution is unknown: the organisation of the video data and the choice of initial clustering centres. To tackle these problems, we propose a Graph Convolutional Neural Network (GCN) that uses dense connections and a classification attention mechanism. The deeply connected graph convolutional network (DC-GCN) learns about distant nodes by making the GCN deeper; its dense connections allow it to multiplex the small-scale features of shallow layers and generate features at diverse scales. An attention mechanism is then incorporated to combine features and determine their importance. The Sparrow Search Algorithm (SSA) is used to select the parameters of the proposed model optimally. Finally, experiments are carried out on a coal-mining video dataset and the widely used CC_WEB_VIDEO benchmark. The simulation results show that the proposed approach outperforms several previous methods.
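The abstract does not give the architectural details of the DC-GCN, but the two ideas it names, dense connections that multiplex shallow-layer features and an attention mechanism that weights them, can be illustrated with a minimal sketch. The sketch below assumes the standard GCN propagation rule (symmetrically normalised adjacency), DenseNet-style concatenation of earlier layers' outputs, and a simple softmax attention over per-layer feature blocks; all function names, shapes, and the attention form are hypothetical, not the authors' implementation.

```python
import numpy as np

def normalize_adjacency(A):
    """Standard GCN propagation matrix: A_norm = D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def dc_gcn_forward(A, X, weights, attn_vec):
    """Forward pass of a densely connected GCN (illustrative sketch).

    Each layer receives the concatenation of the input and all earlier
    layers' outputs (dense connections), so small-scale features from
    shallow layers are multiplexed into deeper layers. A simple softmax
    attention then weights the per-layer feature blocks and combines them.
    All hidden layers must share the same output width so the weighted
    sum is well defined.
    """
    A_norm = normalize_adjacency(A)
    blocks = [X]
    for W in weights:                           # one weight matrix per layer
        H_in = np.concatenate(blocks, axis=1)   # dense connection
        H = np.maximum(A_norm @ H_in @ W, 0.0)  # graph conv + ReLU
        blocks.append(H)
    hidden = blocks[1:]                         # per-layer hidden features
    scores = np.array([np.mean(H @ attn_vec) for H in hidden])
    w = np.exp(scores - scores.max())
    w /= w.sum()                                # softmax attention weights
    return sum(wi * Hi for wi, Hi in zip(w, hidden))

# Toy usage on a 4-node cycle graph with 3-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 3))
hid = 5
weights = [rng.standard_normal((3, hid)),
           rng.standard_normal((3 + hid, hid)),
           rng.standard_normal((3 + 2 * hid, hid))]
Z = dc_gcn_forward(A, X, weights, rng.standard_normal(hid))
```

Because each layer's input width grows with depth, the weight matrices widen accordingly; this is the mechanism by which shallow features remain directly visible to deep layers. The SSA parameter selection mentioned in the abstract is not sketched here.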
