Abstract

This paper presents an efficient method for coding quantization parameters (QPs) in video coding. Modern real-world video coders modulate QPs within a frame according to the local perceptual sensitivity of the human visual system, aiming to improve subjective quality. Frequent QP modulation increases the code amount spent on QPs and can counteract the intended subjective quality improvement, so efficient QP coding is important. Based on the observation that similar textures have similar perceptual sensitivities for the human visual system, the proposed method predicts the QP of each coding block from the spatially/temporally neighboring blocks referenced by intra/inter prediction. By leveraging information already coded for intra/inter prediction, the proposed method effectively predicts a probable QP from neighboring blocks without any additional bits. Experimental results show that the proposed method improves the coding efficiency of QPs by 16% to 20%, whereas the conventional method achieves improvements of only 0.2% to 6.9%.
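The core idea can be illustrated with a small sketch. The abstract does not give the exact prediction rule, so the following is a hypothetical toy model under stated assumptions: QPs are signalled as residuals (delta QPs) against a predictor, the conventional predictor is the previous block's QP in coding order, and the proposed predictor reuses the QP of the already-coded neighbor block that intra/inter prediction references (so the reference index costs no extra bits). All block QPs and reference choices below are illustrative, not data from the paper.

```python
# Per-block QPs produced by perceptual modulation (illustrative values:
# two interleaved texture classes with QPs 26 and 30).
qps = [26, 30, 26, 30, 26, 30]

# Conventional predictor: the QP of the previous block in coding order
# (the first block is assumed predicted from the slice-level QP, 26).
conv_pred = [qps[0]] + qps[:-1]

# Proposed predictor (sketch): the QP of the neighbor block that
# intra/inter prediction references. We assume each block references
# the most recent coded block with similar texture; block 1 has no
# same-texture predecessor and falls back to block 0.
ref_index = [0, 0, 0, 1, 2, 3]          # hypothetical reference choices
prop_pred = [qps[i] for i in ref_index]

# Delta QPs that would actually be entropy coded.
conv_deltas = [q - p for q, p in zip(qps, conv_pred)]
prop_deltas = [q - p for q, p in zip(qps, prop_pred)]

print(conv_deltas)  # alternating nonzero deltas
print(prop_deltas)  # mostly zero deltas, cheaper to code
```

In this toy case the conventional predictor yields five nonzero deltas while the reference-based predictor yields one, which is the intuition behind the reported bit savings: blocks with similar textures tend to share QPs, and the prediction structure already identifies such blocks.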
