Abstract
Depth estimation on light field images has been well researched on synthetic datasets. However, existing methods often fail on real-world light field images because of real-world noise. Due to the lack of real-world light field datasets for depth estimation, learning-based methods, e.g., convolutional neural networks, cannot perform well on real-world light field images when only synthetic datasets are available for training. In this paper, we adopt an optical flow estimation method to estimate depth from light field images since (i) existing optical flow methods have shown robustness to significant real-world noise; (ii) the adopted optical flow estimation method does not require any training data, and is therefore not limited to synthetic datasets. Depth can be recovered via optical flow estimation because (i) a sub-aperture image array can be extracted from a light field image; (ii) the optical flow between every two adjacent sub-aperture images of the array is exactly the disparity between the two images. Furthermore, since a sub-aperture array contains numerous adjacent image pairs, numerous optical flow maps can be generated from it. We show that depth can be well refined from these optical flow maps by modeling them in a single graphical model (a pairwise Gaussian MRF, to be precise) and performing inference on this model with Gaussian belief propagation (GaBP). Experiments show that (i) the proposed method achieves better depth estimation than state-of-the-art methods on real-world light field images; (ii) the proposed method estimates the depth of objects at a far distance (3.5–9.5 m) much more accurately than state-of-the-art convolutional neural networks.
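As a minimal sketch of the geometric relationship the abstract relies on (not the paper's implementation), the optical flow magnitude between two adjacent sub-aperture images equals their disparity, and depth follows from the standard pinhole relation z = f·B/d, where f is the focal length in pixels and B the baseline between adjacent sub-apertures. The function name and the numeric values below are illustrative assumptions:

```python
import numpy as np

def depth_from_disparity(disparity, focal_length, baseline):
    """Convert a disparity map (pixels) between two adjacent
    sub-aperture images into a depth map (metres).

    disparity    : per-pixel optical-flow magnitude (= disparity)
    focal_length : focal length in pixels (illustrative value)
    baseline     : spacing of adjacent sub-apertures in metres
    """
    d = np.asarray(disparity, dtype=float)
    # Guard against division by zero where the flow is degenerate:
    # zero disparity corresponds to a point at infinity.
    return np.where(d > 0, focal_length * baseline / np.maximum(d, 1e-9), np.inf)

# Illustrative numbers: f = 1000 px, baseline = 1 mm.
disp = np.array([[0.2, 0.4],
                 [1.0, 0.0]])
depth = depth_from_disparity(disp, focal_length=1000.0, baseline=1e-3)
# A 0.2 px disparity maps to 5 m, i.e. within the far range (3.5-9.5 m)
# where the abstract reports the largest accuracy gains.
```

In the full method, one such disparity map exists per adjacent sub-aperture pair, and the paper fuses all of them in a pairwise Gaussian MRF solved by GaBP rather than converting a single map as above.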