Abstract

In movie transmission, video frames are subject to loss due to noise and/or congestion. The loss of video frames can cause a loss of synchronization between the audio and video streams. If not corrected, this cumulative loss can degrade a motion picture's quality beyond viewers' tolerance. Furthermore, many Internet applications that provide video and audio streams must operate with limited resources; applications such as video conferencing and on-demand movie delivery can afford neither extra time nor abundant resources. In this paper, we develop and study real-time motion-based and interpolation-based techniques for synchronization-sensitive frame estimation that use only the frames already received, without retransmissions or error-control information. The estimated frames are then injected at their appropriate locations in the movie stream to restore synchronization to within viewers' tolerance. We aim to estimate the original frames as closely as possible at a suitable computational cost. We use a normalized luminance model to track motion between received frames in the presence of lost data, and we then use the generated motion vectors to reconstruct missing data in frames and to restore synchronization between the transmitted audio and video streams. We also develop linear and quadratic interpolations for the same purpose. In addition, we develop hybrids of these techniques to exploit the strengths of each frame estimation technique under varying degrees of loss and different scene types. Using our techniques, we fully restored synchronization between audio and video streams while estimating the lost data. We evaluated the results both objectively and subjectively.
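To make the two families of estimators named above concrete, the sketch below illustrates the general ideas in Python/NumPy under our own simplifying assumptions: per-pixel linear and quadratic (Lagrange) interpolation between received frames, and an exhaustive block-matching motion search on mean-normalized luminance blocks as one plausible reading of the "normalized luminance model". The function names, block size, and search range are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def linear_estimate(prev_f: np.ndarray, next_f: np.ndarray, t: float) -> np.ndarray:
    """Reconstruct a lost frame at normalized position t in (0, 1) between
    two received frames by per-pixel linear interpolation."""
    est = (1.0 - t) * prev_f.astype(np.float64) + t * next_f.astype(np.float64)
    return np.clip(est, 0, 255).astype(np.uint8)

def quadratic_estimate(f0, f1, f2, t0, t1, t2, t):
    """Reconstruct a lost frame at time t by fitting a quadratic (Lagrange)
    polynomial per pixel through three received frames at times t0 < t1 < t2."""
    w0 = (t - t1) * (t - t2) / ((t0 - t1) * (t0 - t2))
    w1 = (t - t0) * (t - t2) / ((t1 - t0) * (t1 - t2))
    w2 = (t - t0) * (t - t1) / ((t2 - t0) * (t2 - t1))
    est = (w0 * f0.astype(np.float64) + w1 * f1.astype(np.float64)
           + w2 * f2.astype(np.float64))
    return np.clip(est, 0, 255).astype(np.uint8)

def motion_vector(prev_f, next_f, y, x, block=16, search=8):
    """Find the displacement of one block between two received 2-D luminance
    frames by exhaustive search, matching mean-normalized blocks so that a
    uniform brightness change does not bias the match."""
    ref = prev_f[y:y + block, x:x + block].astype(np.float64)
    ref -= ref.mean()  # normalize the luminance of the reference block
    best, best_cost = (0, 0), np.inf
    h, w = next_f.shape
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                continue
            cand = next_f[yy:yy + block, xx:xx + block].astype(np.float64)
            cand -= cand.mean()
            cost = np.abs(ref - cand).sum()  # SAD on normalized blocks
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best
```

In a motion-based reconstruction along these lines, each block of a frame lost midway between two received frames could be placed halfway along its estimated motion vector, whereas the interpolation-based estimators fill the same position directly from pixel values; a hybrid scheme would choose between them per scene or per loss burst.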
