Abstract

Phase-based motion magnification is a recent video processing technique that decomposes video frames into a set of kernel responses in order to reveal subtle motions, imperceptible to the naked eye, within a specific frequency range. However, determining the optimal kernel parameters, namely the center frequency and bandwidth, is challenging, especially when the structure is geometrically complex and its vibration modes are not well separated in the frequency domain. The decomposition is usually performed with handcrafted kernels, such as complex steerable pyramids and Gabor wavelets, which may not be optimal for extracting subtle motions in a specific scenario. In this paper, the decomposition kernels are instead learned directly from baseline images acquired from existing videos using a deep convolutional neural network (CNN). Many of the learned kernel responses resemble Gabor wavelets and Laplacian filters, suggesting that the proposed network extracts information similar to that captured by complex steerable filters; the texture kernel responses, by contrast, consist largely of blurring kernels.

Keywords: Convolutional neural networks; Deep learning; Transfer learning; Modal analysis; Image processing; Phase-based motion magnification
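
To make the decomposition concrete, the following minimal Python sketch (using NumPy and SciPy, a tooling assumption rather than the authors' code) builds one handcrafted complex Gabor kernel of the kind the learned CNN kernels are reported to resemble, and extracts the local phase whose temporal change phase-based magnification amplifies. The function names, the toy frames, and the magnification factor alpha are illustrative, not taken from the paper.

import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(size=15, wavelength=6.0, theta=0.0, sigma=3.0):
    # Complex Gabor kernel: a Gaussian envelope times a complex sinusoid.
    # wavelength sets the center frequency and sigma the bandwidth -- the
    # two parameters the abstract identifies as hard to choose by hand.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along theta
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.exp(2j * np.pi * xr / wavelength)
    return envelope * carrier

def local_phase(frame, kernel):
    # Convolving with a complex kernel gives a complex response whose
    # angle is the local phase; its change over time tracks sub-pixel motion.
    response = convolve2d(frame, kernel, mode="same", boundary="symm")
    return np.abs(response), np.angle(response)

# Toy usage: a one-pixel horizontal shift shows up as a phase difference.
frame0 = np.random.rand(64, 64)          # stand-in grayscale frame
frame1 = np.roll(frame0, 1, axis=1)      # the same frame, shifted
k = gabor_kernel(theta=0.0)              # kernel tuned to horizontal motion
_, phi0 = local_phase(frame0, k)
_, phi1 = local_phase(frame1, k)
alpha = 10.0                             # assumed magnification factor
phi_mag = phi0 + alpha * (phi1 - phi0)   # amplified phase (wrapping ignored)

A learned decomposition, as investigated in the paper, would replace the handcrafted gabor_kernel with convolutional filters trained on baseline video frames, leaving the phase-amplification step unchanged.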
