Abstract
In this paper, we propose a dual diversified dynamical Gaussian process latent variable model (D³GPLVM) to tackle the video repair problem. For preservation purposes, videos must be stored on media such as film and hard disks; however, such media can suffer unexpected data loss, for instance through physical damage. Repairing missing or damaged pixels is therefore essential for better video maintenance. Most methods fill in missing holes by synthesizing similar textures from local patches (the neighboring pixels), consecutive frames, or the whole video. However, these approaches can introduce incorrect context, especially when the missing hole or the number of damaged frames is large. Furthermore, simple texture synthesis can introduce artifacts in both undamaged and recovered areas. To address the aforementioned problems, we introduce two diversity-encouraging priors, on the inducing points and on the latent variables, to account for the variety present in existing videos. In D³GPLVM, the inducing points form a small subset of the observed data, while the latent variables are a low-dimensional representation of the observed data. Since both are strongly correlated with the observed data, it is essential that each captures distinct aspects of, and fully represents, the observed data. The dual diversity-encouraging priors ensure that the learned inducing points and latent variables are more diverse and robust, enabling context-aware and artifact-free video repair. The resulting objective function is not analytically tractable and is optimized by variational inference. Finally, experimental results demonstrate the robustness and effectiveness of our method for repairing damaged videos.
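The abstract does not spell out the exact form of the diversity-encouraging priors. One common way to make a set of inducing points repel one another is a determinantal-point-process-style log-determinant term over their kernel matrix. The sketch below is a minimal illustration of that general idea in Python/NumPy, not the paper's actual prior or objective; the function names (rbf_kernel, log_diversity_prior), the RBF kernel choice, and all parameter values are assumptions made for this example.

import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of X and Y.
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def log_diversity_prior(Z, lengthscale=1.0, jitter=1e-6):
    # Illustrative DPP-style diversity term: log-determinant of the kernel
    # matrix over the inducing points Z. The determinant grows as the points
    # spread apart, so adding this term to a training objective favors
    # diverse inducing points (the paper's actual prior may differ).
    K = rbf_kernel(Z, Z, lengthscale) + jitter * np.eye(Z.shape[0])
    _, logdet = np.linalg.slogdet(K)
    return logdet

# Toy usage: well-spread inducing points score higher than clumped ones.
rng = np.random.default_rng(0)
Z_spread = rng.uniform(-3.0, 3.0, size=(10, 2))   # spread over the latent space
Z_clumped = rng.normal(0.0, 0.1, size=(10, 2))    # nearly coincident points
print(log_diversity_prior(Z_spread) > log_diversity_prior(Z_clumped))  # True

Under an RBF kernel, clustered points make the kernel matrix nearly singular, so the log-determinant penalizes them, while well-spread points score higher; this repulsive effect is the general behavior that diversity-encouraging priors on inducing points and latent variables aim for.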