Abstract

In this paper, an innovative method for analyzing and synthesizing nonlinear dynamic textures is proposed. To capture nonlinear motion, dynamic textures are modeled by dynamic texture units, whose parameters are learned jointly. We use the Daubechies discrete wavelet transform together with different color codings, i.e., YCbCr, YUV, and YIQ, to obtain better visual quality and a more compact representation. The algorithm is general and highly flexible: it exploits the spatial, temporal, and chromatic correlations among pixels to obtain more compact model parameters. The proposed algorithm is simple, automatic, and works well on various types of dynamic textures, reconstructing the sequences with promising visual quality from fewer coefficients. Experimental results show that the proposed dynamic system, combined with the Daubechies discrete wavelet transform and YIQ color coding, achieves a more compact model with acceptable visual quality compared with the available LDS (Linear Dynamic System) and Fourier descriptor models. Such a compact representation for dynamic texture synthesis is very useful for embedded devices with limited memory and computational power, such as tablets and mobile phones.
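To illustrate the per-frame front end described above (color-space conversion followed by a Daubechies wavelet decomposition that keeps only a few large coefficients), the following is a minimal sketch. It assumes NumPy and PyWavelets; the function names, the db4 wavelet choice, and the quantile-based truncation rule are illustrative assumptions and not the paper's exact procedure.

```python
import numpy as np
import pywt

# NTSC RGB -> YIQ conversion matrix (standard values).
RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])

def rgb_to_yiq(frame):
    """Convert an H x W x 3 RGB frame (floats in [0, 1]) to YIQ."""
    return frame @ RGB_TO_YIQ.T

def compact_frame_coefficients(frame, wavelet="db4", level=2, keep=0.05):
    """Apply a Daubechies DWT to each YIQ channel and zero all but the
    largest `keep` fraction of coefficients (hypothetical sparsification
    rule, used here only to show how a compact representation can arise)."""
    yiq = rgb_to_yiq(frame)
    kept = []
    for c in range(3):
        coeffs = pywt.wavedec2(yiq[:, :, c], wavelet=wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        thresh = np.quantile(np.abs(arr), 1.0 - keep)
        arr[np.abs(arr) < thresh] = 0.0  # discard small coefficients
        kept.append((arr, slices))
    return kept

# Usage example: one synthetic 64x64 RGB frame.
frame = np.random.rand(64, 64, 3)
sparse_coeffs = compact_frame_coefficients(frame)
```

In a full pipeline, the retained coefficients of each frame would feed the dynamic-system (LDS-style) model that captures the temporal evolution of the texture; that stage is not shown here.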
