Abstract

Depth sensing devices equipped with an RGB camera can be used to augment conventional images with depth information on a per-pixel basis. Currently available RGB-D sensors include the Asus Xtion Pro, Microsoft Kinect and Intel RealSense™. However, these sensors have certain limitations. Objects that are shiny, transparent or have an absorbing matte surface create problems due to reflection. The IR pattern can also suffer interference when multiple RGB-D cameras are used, and depth information is correctly interpreted only for short distances between the camera and the object. The proposed system, block matching stereo vision (BMSV), uses an RGB-D camera with rectified/non-rectified block matching and image pyramiding, along with dynamic programming, for human tracking and for capturing accurate depth information from shiny/transparent objects. Here, the IR emitter generates a known IR pattern and the depth information is recovered by comparing multiple views of the focused object. The depth map of the BMSV RGB-D camera and the resultant disparity map are fused. This fills any void regions that may have emerged due to interference or reflective/transparent surfaces, and an enhanced dense stereo image is obtained. The proposed method is applied to a 3D realistic head model and a functional magnetic resonance image (fMRI), and the results are presented. The results show an improvement in the speed and accuracy of RGB-D sensors, which in turn provides accurate depth information density irrespective of the object surface.
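
To make the fusion idea concrete, the sketch below (not the authors' implementation) computes a block-matching disparity map from a rectified stereo pair with OpenCV, converts it to depth, and uses it to fill void (zero) regions in the sensor's depth map. The function name and all parameter values (numDisparities, blockSize, focal length, baseline) are illustrative assumptions; the paper's image pyramiding and dynamic-programming refinement are omitted.

```python
import cv2
import numpy as np

def fuse_depth_with_stereo(sensor_depth, left_gray, right_gray,
                           focal_px=525.0, baseline_m=0.075):
    """Fill zero-valued holes in a sensor depth map using block-matching stereo.

    sensor_depth : float32 array of metric depth from the RGB-D sensor (0 = void)
    left_gray, right_gray : rectified 8-bit grayscale stereo pair
    focal_px, baseline_m : assumed camera intrinsics/extrinsics for Z = f*B/d
    """
    # Block-matching disparity on the rectified pair.
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # Convert disparity to metric depth where the match is valid (d > 0).
    stereo_depth = np.zeros_like(disparity)
    valid = disparity > 0
    stereo_depth[valid] = focal_px * baseline_m / disparity[valid]

    # Keep the sensor's depth where it exists; fill voids with stereo depth.
    fused = sensor_depth.copy()
    holes = (sensor_depth == 0) & valid
    fused[holes] = stereo_depth[holes]
    return fused
```

In this arrangement the active IR depth map remains the primary source, and stereo-derived depth only patches regions the sensor could not resolve (e.g. reflective or transparent surfaces), mirroring the fusion step described in the abstract.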
