Abstract

To obtain reliable, high-resolution depth images, this study proposes a novel method that fuses data acquired from time-of-flight (ToF) and stereo cameras, thereby utilising the advantages of both active and passive sensing. Based on the classic ToF error model, gradient information is introduced to establish a likelihood distribution over all disparity candidates. In parallel, the stereo likelihood is estimated using a 3D adaptive support-weight approach. The two independent likelihoods are unified through maximum likelihood estimation, a process referred to herein as a joint depth filter. Conventional post-processing steps, such as a mutual consistency check, are applied after the joint depth filter. We also propose a novel hole-filling method based on a seed-growing algorithm to retrieve missing disparities. Experimental results show that the proposed fusion method produces reliable high-resolution depth maps and outperforms the compared methods.
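
To illustrate the fusion step described above, the following is a minimal sketch of combining two per-pixel likelihood volumes over disparity candidates via maximum likelihood, assuming both sensors yield independent likelihoods discretised over the same disparity range. The function name `fuse_likelihoods`, the array shapes, and the random test volumes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def fuse_likelihoods(tof_likelihood, stereo_likelihood):
    """Fuse two per-pixel likelihood volumes over disparity candidates.

    Both inputs are assumed to have shape (H, W, D), where D is the number
    of disparity candidates, and to hold (unnormalised) likelihood values.
    """
    # Under an independence assumption, the joint likelihood is the product
    # of the two sensor likelihoods; summing in the log domain avoids
    # numerical underflow.
    eps = 1e-12
    joint_log = np.log(tof_likelihood + eps) + np.log(stereo_likelihood + eps)

    # Maximum likelihood estimate: pick the disparity candidate that
    # maximises the joint likelihood at each pixel.
    return np.argmax(joint_log, axis=2)


if __name__ == "__main__":
    # Hypothetical usage with random volumes standing in for real sensor data.
    H, W, D = 48, 64, 32
    rng = np.random.default_rng(0)
    tof = rng.random((H, W, D))
    stereo = rng.random((H, W, D))
    disparity = fuse_likelihoods(tof, stereo)
    print(disparity.shape)  # (48, 64)
```

In practice, the resulting disparity map would then be refined by the post-processing steps mentioned in the abstract (mutual consistency check and seed-growing hole filling), which are not sketched here.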
