Abstract

Color-guided Depth map Super-Resolution (DSR) methods currently face three thorny problems: (1) joint DSR methods suffer severe detail and structure loss at very high upsampling rates; (2) existing DSR networks have high computational complexity; (3) color-depth inconsistency makes it difficult to fuse dual-modality features. To resolve these problems, we propose a joint hybrid-cross guidance filter method that progressively recovers the quality of degraded Low-Resolution (LR) depth maps by exploiting color-depth consistency from multiple perspectives. Specifically, the proposed method leverages a pyramid structure to extract multi-scale features from the High-Resolution (HR) color image. At each scale, a hybrid side window filter block is proposed to achieve efficient color feature extraction after each down-sampling of the HR color image; this block is also used to extract depth features from the LR depth map. In addition, we propose a multi-perspective cross-guided fusion filter block that progressively fuses high-quality multi-scale structure information from the color image with the corresponding enhanced depth features. Within this filter block, two kinds of space-aware group-compensation modules are introduced to capture spatial features from different perspectives, and a color-depth cross-attention module is proposed to extract color-depth consistency features for effective boundary preservation. Comprehensive qualitative and quantitative experimental results demonstrate that our method achieves superior performance against many state-of-the-art depth SR approaches in terms of mean absolute deviation and root mean square error on the Middlebury, NYU-v2 and RGB-D-D datasets.
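The color-depth cross-attention fusion described above can be illustrated with a minimal sketch. This is not the paper's implementation; it only shows the general pattern of one modality attending to the other. All names (`cross_modal_attention`, the residual fusion, the flattened `(N, C)` feature layout) are assumptions for illustration, using depth features as queries against color features as keys and values.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_modal_attention(color_feat, depth_feat):
    """Hypothetical color-depth cross-attention sketch.

    Queries come from the depth branch; keys and values come from the
    color branch, so each depth position aggregates color structure that
    is consistent with it. Both inputs are flattened spatial feature
    maps of shape (N, C): N positions, C channels.
    """
    scale = np.sqrt(color_feat.shape[-1])
    # (N, N) attention: how strongly each depth position attends to
    # each color position.
    attn = softmax(depth_feat @ color_feat.T / scale, axis=-1)
    fused = attn @ color_feat          # color structure gathered per depth position
    return depth_feat + fused          # residual fusion keeps the depth signal
```

A real network would compute queries, keys and values through learned projections and operate per attention head; this sketch omits those to keep the cross-modality flow visible.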
