Abstract
Recently, numerous approaches have been proposed for reducing the encoding complexity of high efficiency video coding by predicting the coding tree unit partition with deep neural networks. However, these approaches cannot run in real time due to the complexity of their network architectures. In this paper, we propose a network pruning approach to accelerate a state-of-the-art deep neural network model for real-time coding tree unit partition. Specifically, we first investigate the computational complexity throughout the network, and find that most calculations can be eliminated by pruning the weight parameters. Considering that the number of weight parameters differs drastically across network layers and partition levels, we design an adaptive pruning scheme that applies a suitable retention ratio of weight parameters to each layer at each level. The retention ratio is the ratio of the number of weight parameters after pruning to the number before pruning. By varying the retention ratios, we obtain several accelerated network models with different levels of complexity. We further propose a complexity control algorithm that applies different accelerated models to different coding tree units, so that the actual encoding complexity stays close to a given target. To preserve rate-distortion performance, we formulate complexity control as a convex optimization problem and derive a closed-form solution. Experimental results show that our approach accelerates the original deep neural network model by 17-20 times, at little cost in Bjontegaard delta bit-rate. For complexity control, we achieve high accuracy, with a control error below 2% for most video sequences.
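The adaptive pruning scheme described above keeps only a fraction (the retention ratio) of each layer's weight parameters. A minimal sketch of one common way to realize this, magnitude-based pruning with a per-layer retention ratio, is shown below; the function name, the magnitude criterion, and the example ratio are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def prune_by_retention(weights, retention_ratio):
    """Zero out all but the largest-magnitude weights.

    retention_ratio: fraction of weight parameters kept after pruning
    (illustrative magnitude-based criterion; the paper's exact pruning
    rule may differ).
    """
    flat = np.abs(weights).ravel()
    k = max(1, int(round(retention_ratio * flat.size)))
    # Threshold = magnitude of the k-th largest weight.
    threshold = np.partition(flat, -k)[-k]
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Hypothetical example: keep 25% of an 8x8 layer's weights.
rng = np.random.default_rng(0)
layer = rng.standard_normal((8, 8))
pruned, mask = prune_by_retention(layer, 0.25)
```

Applying different retention ratios per layer and partition level yields the family of accelerated models that the complexity control algorithm then selects among per coding tree unit.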