Abstract

3D video applications are now widely accessible to consumers. High-quality depth maps are crucial for interpolating the virtual view images on which these applications depend. Many systems adopt time-of-flight (ToF) cameras for their real-time distance capture. However, raw ToF images exhibit limitations such as inaccurate distance values at object boundaries and on surfaces that absorb infrared light. This paper presents a method that addresses these errors, which existing methods largely overlook. The proposed method is evaluated on a camera system consisting of a color camera and a ToF camera. Targeting immersive applications such as teleconferencing and virtual broadcasting, the object of interest is captured, and the errors in the ToF image are corrected in a preprocessing step. The low-resolution ToF image is then warped to the viewpoint of the color camera, and the resulting empty areas are filled using neighboring depth information. The resulting depth maps show improvements over a state-of-the-art method. The proposed method can increase the overall quality of a 3D video system, ultimately making consumer 3D video applications more marketable.
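The warp-then-fill step described above can be sketched in simplified form. This is not the paper's implementation; it is a minimal illustration assuming rectified cameras separated by a purely horizontal baseline, so each depth pixel shifts by disparity d = fx * baseline / z. The function name, parameters, and the background-preferring row fill are all illustrative assumptions.

```python
import numpy as np

def warp_depth_to_color_view(depth, fx, baseline):
    """Forward-warp a depth map to a horizontally shifted viewpoint,
    then fill the holes the warp creates from neighboring pixels.

    Illustrative sketch only: assumes rectified cameras with a
    horizontal baseline, so disparity = fx * baseline / depth.
    Invalid/unknown depth is encoded as 0.
    """
    h, w = depth.shape
    warped = np.zeros_like(depth)
    for y in range(h):
        for x in range(w):
            z = depth[y, x]
            if z <= 0:
                continue  # skip invalid source pixels
            xt = x + int(round(fx * baseline / z))
            if 0 <= xt < w:
                # z-buffer test: on collisions, keep the nearer surface
                if warped[y, xt] == 0 or z < warped[y, xt]:
                    warped[y, xt] = z

    # Fill remaining holes from the nearest valid neighbor in each row,
    # preferring the farther (background) value so foreground objects
    # do not bleed into disoccluded regions.
    filled = warped.copy()
    for y in range(h):
        row = warped[y]
        for x in range(w):
            if row[x] == 0:
                left = next((row[i] for i in range(x - 1, -1, -1) if row[i] > 0), 0)
                right = next((row[i] for i in range(x + 1, w) if row[i] > 0), 0)
                candidates = [v for v in (left, right) if v > 0]
                if candidates:
                    filled[y, x] = max(candidates)
    return filled
```

In practice the warp uses the full intrinsic and extrinsic calibration of both cameras rather than a one-dimensional disparity, and the hole filling would weigh 2D neighborhoods, but the structure (forward warp with z-buffering, then neighbor-based fill) is the same.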
