Abstract

Surface normal estimation is a fundamental yet challenging pixel-wise task in computer vision. With the advance of convolutional neural networks, data-driven methods have demonstrated their superiority in predicting surface normals end-to-end from a single image, rather than relying on manually extracted geometric cues. However, most existing methods produce poorly predicted details, such as surface smoothness and object shape, due to the lack of pixel-wise constraints. In view of this, we propose a coarse-to-fine approach, called Seman-sn, to estimate the surface normal from a single RGB image. The proposed method adaptively fuses semantic information via neural architecture search, and then further improves the details of the surface normal estimate by exploiting pixel-wise semantic constraints. Specifically, initial predictions are first made by automatically selected feature-extraction layers. The pixel-wise constraint between surface normal estimation and semantic segmentation is then modeled to refine the surface normal features in the final prediction stage. Experiments on a widely used dataset show clear superiority over recent methods, and ablation studies, covering both quantitative results and visualizations, confirm the effectiveness of the proposed components.
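
The abstract gives no implementation details, but the pipeline it describes (a coarse prediction stage, fusion of semantic features, and a pixel-wise refinement stage) can be sketched roughly as follows. This is a minimal PyTorch sketch under our own assumptions: the class name `CoarseToFineNormalNet`, the helper `semantic_smoothness_loss`, the plain convolutional encoder, and all channel sizes are hypothetical placeholders, not the authors' Seman-sn architecture or its NAS-searched extraction layers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoarseToFineNormalNet(nn.Module):
    """Hypothetical two-stage sketch: a coarse normal head followed by a
    refinement stage that fuses semantic-segmentation features."""
    def __init__(self, feat_ch=64, num_classes=40):
        super().__init__()
        # Shared encoder stub (the paper would use a NAS-searched backbone).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Stage 1: coarse surface normals (3 channels, unit vectors).
        self.coarse_head = nn.Conv2d(feat_ch, 3, 3, padding=1)
        # Semantic branch producing per-pixel class logits.
        self.seg_head = nn.Conv2d(feat_ch, num_classes, 3, padding=1)
        # Stage 2: refine normals conditioned on features, coarse normals,
        # and segmentation logits concatenated along the channel axis.
        self.refine = nn.Sequential(
            nn.Conv2d(feat_ch + 3 + num_classes, feat_ch, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, 3, 3, padding=1),
        )

    def forward(self, rgb):
        feat = self.encoder(rgb)
        coarse = F.normalize(self.coarse_head(feat), dim=1)
        seg = self.seg_head(feat)
        residual = self.refine(torch.cat([feat, coarse, seg], dim=1))
        # Residual refinement, re-normalized to unit length.
        return F.normalize(coarse + residual, dim=1), seg

def semantic_smoothness_loss(normals, seg_logits):
    """Hypothetical pixel-wise constraint: penalize normal variation between
    adjacent pixels that the segmentation branch assigns to the same class."""
    labels = seg_logits.argmax(dim=1, keepdim=True)          # (B, 1, H, W)
    same_x = (labels[..., :, 1:] == labels[..., :, :-1]).float()
    same_y = (labels[..., 1:, :] == labels[..., :-1, :]).float()
    dx = (normals[..., :, 1:] - normals[..., :, :-1]).abs().sum(1, keepdim=True)
    dy = (normals[..., 1:, :] - normals[..., :-1, :]).abs().sum(1, keepdim=True)
    return (same_x * dx).mean() + (same_y * dy).mean()

# Usage: forward a batch and combine with the usual supervised normal loss.
net = CoarseToFineNormalNet()
normals, seg_logits = net(torch.randn(2, 3, 240, 320))
loss = semantic_smoothness_loss(normals, seg_logits)
```

The smoothness term encodes one plausible reading of the pixel-wise constraint: normals are encouraged to vary slowly inside a region that the segmentation predicts as a single class, while class boundaries remain free to carry sharp normal changes.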
