Abstract

We present a novel approach to video coding that can dramatically reduce decoder DRAM bandwidth requirements at the cost of only a minimal reduction in compression efficiency, and that may even increase compression efficiency under certain circumstances. Our approach is based on the principle that areas of a video picture containing high motion are typically captured with significant blur along the direction of motion. This blur permits the judicious use of reduced-resolution reference pictures for prediction without significantly reducing prediction quality. Our approach makes it feasible to limit worst-case DRAM bandwidth using reasonably sized on-chip caches for pixel data, which can lead to provably compliant real-time behavior. The reductions in DRAM bandwidth can be expected to yield commensurate reductions in DRAM power dissipation, and consequently improvements in battery life for mobile devices. Although we concentrate our analysis on decoders, our approach can yield even greater advantages for encoders, which require additional bandwidth for motion estimation. We show that the compression efficiency of our encoder can approach that of a standard reference encoder on natural video sequences, but that it may fall short by a modest amount on pathological synthetic sequences.
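The following is a minimal, illustrative Python sketch of the core idea, not the paper's actual codec: fetching a motion-compensated prediction block from a half-resolution copy of the reference picture reduces the pixels read from memory by a factor of four, and in a motion-blurred (low-pass) region the resulting prediction differs little from a full-resolution fetch. The function names, the nearest-neighbour upsampling, and the box-filter blur are assumptions made purely for illustration.

```python
import numpy as np

def fetch_block_full_res(ref, x, y, w, h):
    """Fetch a w x h prediction block from the full-resolution reference.
    Memory traffic: w * h pixels."""
    return ref[y:y + h, x:x + w]

def fetch_block_half_res(ref_half, x, y, w, h):
    """Fetch the corresponding block from a half-resolution reference and
    upsample it back to w x h. Memory traffic: (w/2) * (h/2) pixels,
    i.e. one quarter of the full-resolution fetch."""
    small = ref_half[y // 2:y // 2 + h // 2, x // 2:x // 2 + w // 2]
    # Nearest-neighbour upsampling keeps the sketch short; a real decoder
    # would use a proper interpolation filter.
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

# Toy usage: simulate a horizontally motion-blurred reference picture.
rng = np.random.default_rng(0)
ref = rng.random((64, 64)).astype(np.float32)
kernel = np.ones(8, dtype=np.float32) / 8.0  # box blur along the motion direction
blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, ref)
ref_half = blurred[::2, ::2]  # downsampled copy kept as the cheap reference

full = fetch_block_full_res(blurred, 16, 16, 16, 16)
half = fetch_block_half_res(ref_half, 16, 16, 16, 16)
print("mean abs prediction difference:", float(np.abs(full - half).mean()))
```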
