Abstract

Estimating the relative depth of a single image (Monocular Depth Estimation) is a significant step towards understanding the general structure of the depicted scene, the relations of the entities within it, and their interactions. When estimating the depth of a scene without stereo images, we depend on the availability of large-scale depth datasets and high-capacity models to capture the intrinsic nature of depth. Unfortunately, creating large-scale datasets of depth images is not a trivial task. To overcome this limitation, this work proposes a new approach to accumulate Depth and Surface Normal datasets from the worlds of different video games in an easy and reproducible way. It also introduces a new loss function that better incorporates the relation between the depth and the surface normals of a scene, yielding higher-quality depth estimations that also produce more uniform surface normals. Qualitative and quantitative comparisons with the best approaches of the last nine years show that the proposed method is competitive with the state of the art while using ≈24 times fewer parameters. An ablation study further demonstrates the effectiveness of the approach. Experiments on this dataset show that using the new loss function alongside synthetic datasets increases accuracy on "Monocular Depth Estimation in the Wild" tasks, where other approaches usually fail to generalize.
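As an illustrative sketch only (not the paper's exact formulation), a loss coupling depth and surface normals can be built by deriving normals from the predicted depth via finite differences and penalizing their angular deviation from the dataset's ground-truth normals. All function and parameter names below are assumptions for the sake of the example.

```python
import torch
import torch.nn.functional as F

def normals_from_depth(depth: torch.Tensor) -> torch.Tensor:
    """Approximate surface normals from a depth map of shape (B, 1, H, W)
    using forward finite differences."""
    # Depth gradients along x and y, padded to preserve spatial size.
    dz_dx = F.pad(depth[:, :, :, 1:] - depth[:, :, :, :-1], (0, 1, 0, 0))
    dz_dy = F.pad(depth[:, :, 1:, :] - depth[:, :, :-1, :], (0, 0, 0, 1))
    # A surface normal is proportional to (-dz/dx, -dz/dy, 1); normalize to unit length.
    ones = torch.ones_like(depth)
    normals = torch.cat([-dz_dx, -dz_dy, ones], dim=1)
    return F.normalize(normals, dim=1)

def depth_normal_consistency_loss(pred_depth: torch.Tensor,
                                  gt_normals: torch.Tensor) -> torch.Tensor:
    """Penalize the angular deviation between normals implied by the predicted
    depth and the ground-truth normals (gt_normals: (B, 3, H, W), unit length)."""
    pred_normals = normals_from_depth(pred_depth)
    cos_sim = (pred_normals * gt_normals).sum(dim=1)  # per-pixel cosine similarity
    return (1.0 - cos_sim).mean()
```

In practice, a term like this would be added to a standard depth regression loss with a weighting factor, so the network is rewarded both for accurate depth values and for locally consistent surface orientation.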
