Abstract
Texture mapping, a fundamental task in 3D modeling, is well established for carefully acquired aerial assets under consistent illumination, yet it remains challenging when scaled to large datasets with images captured under varying views and illuminations. A well-performing texture mapping algorithm must efficiently select views and fuse and map textures from those views onto mesh models, while achieving consistent radiometry over the entire model. Existing approaches gain efficiency either by limiting each face to texture from a single view, or by simplifying global inference to enforce only local color consistency. In this paper, we break this trade-off by proposing a novel and efficient texture mapping framework that uses multiple views of texture per face while achieving global color consistency. The proposed method leverages a loopy belief propagation algorithm to perform efficient, global probabilistic inference that ranks candidate views per face, enabling face-level multi-view texture fusion and blending. The texture fusion algorithm, being non-parametric, offers a further advantage over typical parametric post-hoc color correction methods: improved robustness to non-linear illumination differences. Experiments on three types of datasets (satellite, unmanned aerial vehicle, and close-range) show that the proposed method produces visually pleasing and texturally consistent results in all scenarios, while consuming less running time than state-of-the-art methods, especially on large-scale datasets such as satellite-derived models.
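The abstract does not spell out the inference details, but the view-selection step it describes fits a standard Markov random field formulation: each mesh face picks a label from its candidate views, a unary cost scores how well a view sees the face, and a pairwise cost penalizes adjacent faces that choose inconsistent views. The sketch below illustrates that formulation with min-sum loopy belief propagation in Python. Everything here is an illustrative assumption rather than the authors' implementation: the toy graph, the costs, the smoothness weight `LAMBDA`, and the softmin-weighted blend, which merely stands in for the paper's (unspecified) non-parametric fusion.

```python
# Minimal sketch (not the authors' method) of min-sum loopy belief propagation
# for ranking candidate views per mesh face, followed by a top-k blend.
import numpy as np

LAMBDA = 0.5   # assumed smoothness weight between adjacent faces
N_ITERS = 20   # assumed number of message-passing sweeps

def loopy_bp_view_ranking(data_cost, adjacency, n_iters=N_ITERS):
    """Rank candidate views per face with min-sum loopy BP.

    data_cost : (n_faces, n_views) unary cost of texturing a face from a view
                (use a large finite cost where a view does not see the face).
    adjacency : list of (i, j) undirected edges between adjacent faces.
    Returns   : (n_faces, n_views) beliefs; lower is better, so
                np.argsort(beliefs, axis=1) ranks views per face.
    """
    n_faces, n_views = data_cost.shape
    # Potts smoothness: zero cost if neighbors agree on the view, LAMBDA otherwise.
    pairwise = LAMBDA * (1.0 - np.eye(n_views))
    # One message per directed edge, initialized to zero.
    edges = [(i, j) for i, j in adjacency] + [(j, i) for i, j in adjacency]
    msgs = {e: np.zeros(n_views) for e in edges}
    incoming = {f: [e for e in edges if e[1] == f] for f in range(n_faces)}

    for _ in range(n_iters):
        new_msgs = {}
        for (i, j) in edges:
            # Sum i's unary cost and all messages into i except the one from j.
            h = data_cost[i] + sum(msgs[e] for e in incoming[i] if e[0] != j)
            # Min-sum update: minimize over i's labels for each label of j.
            m = np.min(h[:, None] + pairwise, axis=0)
            new_msgs[(i, j)] = m - m.min()   # normalize for numerical stability
        msgs = new_msgs

    # Belief per face = unary cost plus all incoming messages.
    beliefs = data_cost.copy()
    for f in range(n_faces):
        for e in incoming[f]:
            beliefs[f] += msgs[e]
    return beliefs

def blend_top_views(beliefs, samples, k=2):
    """Fuse per-face colors from the k best-ranked views (illustrative only).

    samples : (n_faces, n_views, 3) RGB sampled for each face from each view.
    Softmin weights over beliefs let better-ranked views dominate the blend.
    """
    n_faces = beliefs.shape[0]
    order = np.argsort(beliefs, axis=1)[:, :k]   # top-k views per face
    rows = np.arange(n_faces)[:, None]
    w = np.exp(-beliefs[rows, order])            # softmin weights, lower cost -> higher weight
    w /= w.sum(axis=1, keepdims=True)
    return (w[:, :, None] * samples[rows, order]).sum(axis=1)

if __name__ == "__main__":
    # Toy usage: 4 faces in a strip, 3 candidate views.
    cost = np.array([[0.1, 0.8, 0.9],
                     [0.7, 0.2, 0.9],
                     [0.6, 0.3, 0.8],
                     [0.9, 0.7, 0.1]])
    edges = [(0, 1), (1, 2), (2, 3)]
    beliefs = loopy_bp_view_ranking(cost, edges)
    print(np.argsort(beliefs, axis=1))           # best-to-worst views per face
    samples = np.random.rand(4, 3, 3)            # toy per-face, per-view RGB
    print(blend_top_views(beliefs, samples, k=2))
```

Under these assumptions, the Potts smoothness term is what pushes adjacent faces toward consistent view choices, which is the property the abstract credits with enabling globally consistent, face-level multi-view fusion.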