Abstract
Accurate three-dimensional reconstruction is important for many research and industrial applications. Light field depth estimation exploits many observations of the scene and can therefore provide accurate reconstruction. We present a method that enhances an existing reconstruction algorithm with per-layer disparity filtering and consistency-based hole filling. In addition, we reformulate the reconstruction result as a point cloud assembled from different light field viewpoints and propose a nonlinear optimization of it. The ability of our method to reconstruct scenes with acceptable quality was verified by evaluation on a publicly available dataset.
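For illustration, the following is a minimal sketch (not the paper's implementation) of how a per-view disparity map can be lifted to a 3D point cloud under a pinhole camera model; the focal length `f`, view baseline `b`, and principal point `(cx, cy)` are assumed parameters introduced here for the example.

```python
import numpy as np

def disparity_to_points(disp: np.ndarray, f: float, b: float,
                        cx: float, cy: float) -> np.ndarray:
    """Back-project a disparity map of one light field view into 3D points.

    disp   : (H, W) disparity map
    f      : focal length in pixels
    b      : baseline between adjacent sub-aperture views
    cx, cy : principal point in pixels
    """
    h, w = disp.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = f * b / np.maximum(disp, 1e-6)  # depth from disparity; guard against zeros
    x = (u - cx) * z / f                # pinhole back-projection
    y = (v - cy) * z / f
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```

Repeating this for every viewpoint and merging the results yields the kind of multi-view point cloud that the proposed nonlinear optimization then refines.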
Highlights
We present an extension of our depth estimation algorithm from [10].
We compare the proposed algorithm with the state-of-the-art methods presented in Section 2: BSL [18], FASTLFNET [20], EPI1 [16], EPI2 [12], EPINET [19], FSL [10], LF [40], LFOCC [15], OFSY [17], RM3DE [14], RPRF [41], SCGC [42], and SPO [43].
We report results on three metrics: the percentage of pixels whose absolute difference from the ground truth exceeds a threshold of 7% (BadPix), the mean square error over all pixels (MSE), and the maximum absolute disparity error of the best 25% of pixels (Q25); a sketch of these metrics follows this list.
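The following is a minimal sketch of the three metrics, assuming `est` and `gt` are equally shaped NumPy disparity maps; the 7% BadPix threshold is interpreted here as 0.07 in disparity units, which is an assumption rather than a statement of the paper's exact evaluation code.

```python
import numpy as np

def badpix(est: np.ndarray, gt: np.ndarray, threshold: float = 0.07) -> float:
    """Percentage of pixels whose absolute disparity error exceeds the threshold
    (0.07 is assumed to correspond to the 7% threshold named above)."""
    return 100.0 * float(np.mean(np.abs(est - gt) > threshold))

def mse(est: np.ndarray, gt: np.ndarray) -> float:
    """Mean square error over all pixels."""
    return float(np.mean((est - gt) ** 2))

def q25(est: np.ndarray, gt: np.ndarray) -> float:
    """Maximum absolute disparity error among the best 25% of pixels,
    i.e. the 25th percentile of the absolute error distribution."""
    return float(np.percentile(np.abs(est - gt), 25))
```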
Citation: Anisimov, Y.; Rambach, J.; Stricker, D. Nonlinear Optimization of Light Field Point Cloud. Sensors 2022, 22, 814. https://doi.org/10.3390/s22030814