Abstract

In video-based dynamic point cloud compression (V-PCC), 3D point clouds are projected into patches, and the patches are then packed and padded into 2D images suitable for the video compression framework. However, the patch projection-based method produces a large number of empty pixels, and the far and near components are projected into separate 2D images (video frames). As a result, the generated video has a high resolution and a doubled frame rate, so V-PCC incurs huge computational complexity. This paper proposes an occupancy map guided fast V-PCC method. First, the relationship between predictive coding and block complexity is studied based on a local linear image gradient model. Second, according to the V-PCC strategies of patch projection and block generation, we investigate the differences in rate-distortion characteristics between different types of blocks and the temporal correlations between the far and near layers. Finally, by taking advantage of the fact that occupancy maps explicitly indicate the block types, we propose an occupancy map guided fast coding method, in which different coding strategies are applied to the different types of blocks. Experiments on typical dynamic point clouds show that the proposed method achieves an average 43.66% time saving at a cost of only 0.27% and 0.16% Bjontegaard Delta (BD) rate increase under the geometry Point-to-Point (D1) error and the attribute Luma Peak Signal-to-Noise Ratio (PSNR), respectively.
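To illustrate the block classification that the occupancy map makes possible, the following minimal Python sketch labels each block of a binary occupancy map as empty, partially occupied, or fully occupied. This is an assumption-based illustration, not the paper's implementation: the block size of 64, the label names, and the function classify_blocks are hypothetical.

    import numpy as np

    def classify_blocks(occupancy_map: np.ndarray, block_size: int = 64):
        # Label each block of a binary (0/1) occupancy map.
        h, w = occupancy_map.shape
        labels = {}
        for y in range(0, h, block_size):
            for x in range(0, w, block_size):
                block = occupancy_map[y:y + block_size, x:x + block_size]
                occupied = int(block.sum())
                if occupied == 0:
                    labels[(y, x)] = "empty"    # no valid points: candidate for simplified coding
                elif occupied == block.size:
                    labels[(y, x)] = "full"     # entirely valid points: regular coding
                else:
                    labels[(y, x)] = "partial"  # patch boundary: mix of valid and padded pixels
        return labels

    # Usage: a toy 128x128 map containing a single occupied patch.
    occ = np.zeros((128, 128), dtype=np.uint8)
    occ[10:70, 20:90] = 1
    print(classify_blocks(occ, block_size=64))

A fast encoder could then restrict mode decisions or motion search for the "empty" and "partial" block types while leaving "full" blocks untouched, which is the general idea the abstract describes.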
