Abstract

Developments in 3D computer vision have advanced scene understanding and 3D modeling, and surface normal estimation is a fundamental task in these fields. In this paper, we propose a geometry-guided multilevel fusion scheme for high-quality surface normal estimation that exploits texture and geometry information from color and depth images. The surface normal is predicted progressively with a coarse-to-fine strategy. First, an initial surface normal (IniNormal) N_ini is predicted by a hierarchical confidence-reweighting convolutional neural network that merges texture and geometry information at the CNN feature level. Although this achieves good overall accuracy, the long-tail problem causes the IniNormal to fail in regions where the depth map is of high quality but the intensity information is challenging, such as repeating textures and abnormal exposures. Next, a geometry-consistent surface normal (GeoNormal) N_geo is computed from traditional geometric constraints, and a surface-normal-level fusion module remaps the depth to different representations and reconsiders the scene information. Then, the final surface normal N is estimated by adaptively reintegrating the IniNormal and the GeoNormal at the decision level. To overcome disturbances in the dataset and ensure the trainability of the network, a carefully designed hybrid objective function with an annealing term is applied, and an explainability analysis is provided. Experimental results on two benchmark datasets demonstrate that the proposed GMLF (geometry-guided multilevel RGB-D fusion for surface normal estimation) achieves better quantitative and qualitative performance. The proposed method may be useful for robots and autonomous driving in next-generation Internet-of-Things (NG-IoT) applications.
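
To make the GeoNormal idea concrete, the sketch below shows one common way to derive a geometry-consistent normal map directly from a depth image: back-project each pixel to a 3D point using the camera intrinsics, then take the cross product of the tangent vectors along the image axes. This is a minimal illustration of the general technique, not the authors' implementation; the function name `depth_to_normals` and the intrinsic values in the example are assumptions for demonstration only.

```python
# Illustrative sketch (not the paper's code): estimating a surface normal map
# from a depth image via back-projection and finite differences.
import numpy as np


def depth_to_normals(depth, fx, fy, cx, cy):
    """Estimate per-pixel surface normals from a depth map (in meters).

    Back-projects each pixel to a camera-space 3D point, then takes the
    cross product of the horizontal and vertical spatial differences.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))

    # Back-project pixels to camera-space 3D points.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1)            # (H, W, 3)

    # Tangent vectors along the image rows and columns (central differences).
    d_row, d_col = np.gradient(pts, axis=(0, 1))
    n = np.cross(d_col, d_row)                        # (H, W, 3)

    # Normalize; guard against degenerate (flat or invalid) regions.
    norm = np.linalg.norm(n, axis=-1, keepdims=True)
    return n / np.clip(norm, 1e-8, None)


if __name__ == "__main__":
    # Toy example with hypothetical intrinsics: a depth ramp increasing to
    # the right, so the recovered normals tilt away from the camera x-axis.
    v, u = np.mgrid[0:120, 0:160]
    depth = 1.0 + 0.002 * u
    normals = depth_to_normals(depth.astype(np.float64),
                               fx=500.0, fy=500.0, cx=80.0, cy=60.0)
    print(normals[60, 80])
```

In the paper's pipeline, a normal map of this kind would be fused with the CNN prediction at the decision level rather than used on its own, since it is reliable where the depth is clean but noisy where the depth sensor struggles.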
