Abstract

3D video applications are now readily accessible to consumers. High-quality depth maps are crucial for interpolating the virtual view images on which these applications rely. Many systems adopt time-of-flight (ToF) cameras for their real-time distance capture. Yet raw ToF images exhibit limitations such as inaccurate distance values at object boundaries and in areas that absorb infrared light, a problem existing methods do not address. This paper presents a method that handles these errors. The proposed method is evaluated on a camera system consisting of a color camera and a ToF camera. Targeting immersive applications such as teleconferencing and virtual broadcasting, the system captures an object of interest. As a preprocessing step, the errors in the ToF image are corrected. The low-resolution ToF image is then warped to the viewpoint of the color camera, and the resulting empty areas are filled using neighboring depth information. The resulting depth maps show improvements over a state-of-the-art method. The proposed method can raise the overall quality of a 3D video system, ultimately making consumer 3D video applications more marketable.
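The warp-then-fill pipeline described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes pinhole intrinsics `K_tof` and `K_color`, ToF-to-color extrinsics `R`, `t`, and a simple 3x3 median hole filler as stand-ins for the paper's neighbor-based filling:

```python
# Hypothetical sketch (not the paper's code): forward-warp a low-resolution
# ToF depth map into the color camera's viewpoint, then fill holes.
import numpy as np

def warp_depth(depth, K_tof, K_color, R, t, out_shape):
    """Forward-warp each valid ToF depth pixel into the color camera image.
    K_tof, K_color: 3x3 intrinsics; R (3x3), t (3,): ToF->color extrinsics."""
    h, w = depth.shape
    out = np.zeros(out_shape, dtype=np.float64)
    K_tof_inv = np.linalg.inv(K_tof)
    for v in range(h):
        for u in range(w):
            z = depth[v, u]
            if z <= 0:  # zero marks an invalid ToF measurement
                continue
            p = z * (K_tof_inv @ np.array([u, v, 1.0]))  # back-project to 3D
            q = K_color @ (R @ p + t)                    # into color camera
            if q[2] <= 0:
                continue
            x = int(round(q[0] / q[2]))
            y = int(round(q[1] / q[2]))
            if 0 <= y < out_shape[0] and 0 <= x < out_shape[1]:
                # z-buffer: keep the nearest surface when pixels collide
                if out[y, x] == 0 or q[2] < out[y, x]:
                    out[y, x] = q[2]
    return out

def fill_holes(depth):
    """Fill empty (zero) pixels with the median of valid 3x3 neighbors."""
    filled = depth.copy()
    h, w = depth.shape
    for v in range(h):
        for u in range(w):
            if depth[v, u] == 0:
                nb = depth[max(v - 1, 0):v + 2, max(u - 1, 0):u + 2]
                valid = nb[nb > 0]
                if valid.size:
                    filled[v, u] = np.median(valid)
    return filled
```

Because the ToF image has lower resolution than the color image, the forward warp leaves gaps between projected samples; the hole-filling pass estimates those missing depths from nearby valid measurements.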
