Abstract
Capturing the directions of light with light field cameras powers next-generation immersive multimedia applications. A critical problem in exploiting the rich visual information in light field images is depth estimation. Conventional light field depth estimation methods build a cost volume that measures the photo-consistency of pixels refocused to a range of depths, where the highest consistency indicates the correct depth. This strategy works well in most regions but usually produces blurry edges in the estimated depth map due to occlusions. Recent work shows that integrating occlusion models into light field depth estimation can largely reduce these blurry edges. However, existing occlusion handling methods rely on complex edge-aided processing and post-refinement, which limits the resulting depth accuracy and degrades computational performance. In this paper, we propose a novel occlusion-aware vote cost (OAVC) that accurately preserves edges in the depth map. Instead of using photo-consistency as an indicator of the correct depth, we construct a cost from a new perspective: we count the number of refocused pixels whose deviations from the central-view pixel are below a small threshold, and use that count to select the correct depth. Pixels from occluders are thus excluded from determining the correct depth. Without any explicit occlusion handling, the proposed method inherently preserves edges and produces high-quality depth estimates. Experimental results show that the proposed OAVC outperforms state-of-the-art light field depth estimation methods in both depth estimation accuracy and computational complexity.
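The vote-cost idea described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the refocused view stack, the threshold value `tau`, and the function name `oavc_depth` are all hypothetical, and the actual OAVC likely involves additional details (e.g. color channels, sub-pixel refocusing).

```python
import numpy as np

def oavc_depth(refocused, center, tau=0.02):
    """Select per-pixel depth by vote counting (illustrative sketch).

    refocused: (D, V, H, W) array of views refocused to D depth hypotheses
    center:    (H, W) central view
    tau:       deviation threshold for a view to count as a vote
    Returns the (H, W) index map of the depth hypothesis with the most votes.
    """
    # A view "votes" for a depth hypothesis at a pixel if its refocused
    # intensity deviates from the central view by less than tau.
    deviations = np.abs(refocused - center[None, None])      # (D, V, H, W)
    votes = (deviations < tau).sum(axis=1)                   # (D, H, W)
    # Occluder pixels fail the threshold test, so they simply do not
    # contribute votes; the depth with the highest count wins.
    return votes.argmax(axis=0)                              # (H, W)
```

In contrast with a photo-consistency cost (e.g. variance across views), which an occluder can corrupt at every depth, the hard threshold lets consistent views outvote the few occluded ones at the true depth.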
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence