Abstract
Because boundary maps and occlusion orientation maps (ORI-maps) underpin numerous high-level vision problems, accurate estimation of these maps is a crucial task. Existing deep networks employ a single-stream architecture to jointly model the relation between boundary-map and ORI-map estimation; however, such networks fail to exploit the significant information specific to each map separately. To resolve this problem, we propose a novel two-stream generative adversarial network (GAN) for boundary-map and ORI-map estimation, named OBP-GAN. The proposed OBP-GAN consists of two streams, BP-GAN and OR-GAN: BP-GAN estimates the boundary map, and OR-GAN predicts the ORI-map. Boundary and ORI-maps can also serve as useful cues for refining depth maps estimated from single images. Therefore, we further propose a transformer-based depth-map refinement network (TRANSDMR-GAN) that refines monocular depth estimates using the boundary and ORI-maps as guidance. We conducted extensive analyses on indoor and outdoor datasets to validate the proposed OBP-GAN and TRANSDMR-GAN. The experimental analysis and ablation study demonstrate that OBP-GAN generates state-of-the-art occlusion boundary maps, and that TRANSDMR-GAN produces an edge-enhanced depth map without degrading the accuracy of the initial depth map.
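To make the two-stream idea concrete, below is a minimal PyTorch sketch of two independent generator streams, one producing the boundary map and one producing the ORI-map. The layer sizes, output parameterizations, and class names here are illustrative assumptions, not the paper's actual BP-GAN and OR-GAN generators; the point is only that each map is estimated by its own stream rather than a single shared network.

```python
import torch
import torch.nn as nn

class StreamGenerator(nn.Module):
    """A small encoder-decoder standing in for one generator stream.
    The real generators in the paper are more elaborate; this is a sketch."""
    def __init__(self, in_ch: int = 3, out_ch: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, out_ch, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

class TwoStreamOBP(nn.Module):
    """Two separate streams so boundary and orientation information
    are learned independently instead of through one shared network."""
    def __init__(self):
        super().__init__()
        self.bp_stream = StreamGenerator(out_ch=1)  # boundary map (BP-GAN role)
        self.or_stream = StreamGenerator(out_ch=1)  # ORI-map (OR-GAN role)

    def forward(self, image: torch.Tensor):
        boundary = self.bp_stream(image)      # per-pixel boundary probability
        orientation = self.or_stream(image)   # per-pixel occlusion orientation
        return boundary, orientation

# Usage sketch:
# x = torch.randn(1, 3, 256, 256)
# boundary_map, ori_map = TwoStreamOBP()(x)
```

In the full method each stream would be trained adversarially with its own discriminator, and the resulting boundary and ORI-maps would then guide the transformer-based depth refinement; those components are omitted from this sketch.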