Abstract
Appearance preservation aims to estimate reflectance functions that model the way real materials interact with light. These functions are especially useful in digital heritage preservation and realistic rendering, as they reproduce the appearance of real materials in virtual scenes. This work proposes an image-based process for preserving the appearance of surfaces whose reflectance properties are spatially varying. During image acquisition, the process considers the whole environment as a source of light over the area to be preserved and, assuming the environment is static, does not require a controlled environment. To achieve this goal, the scene geometry and relative camera positions are approximated from a set of HDR images taken inside the real scene, using a combination of structure-from-motion and multi-view stereo methods. Based on these data, a set of unstructured lumigraphs is traced on demand inside the reconstructed scene. The color information retrieved from these lumigraphs is then used to estimate a linear combination of basis BRDFs for a grid of points on the surface, thus defining its SVBRDF. This paper details the proposed method and presents results obtained in real and synthetic settings. It shows that considering the whole environment as a source of light is a viable approach to obtaining reliable results and enabling more flexible acquisition setups.
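As a minimal sketch of the estimation step summarized above (the notation $w_k$, $\rho_k$, $K$, and $L_{\mathrm{env}}$ is illustrative and not taken from the paper), the SVBRDF at a surface point $\mathbf{x}$ can be written as a linear combination of $K$ basis BRDFs,

$$ f_r(\mathbf{x}, \omega_i, \omega_o) \approx \sum_{k=1}^{K} w_k(\mathbf{x})\, \rho_k(\omega_i, \omega_o), $$

so that, under environment lighting, each lumigraph observation of outgoing radiance constrains the per-point weights linearly:

$$ L_o(\mathbf{x}, \omega_o) \approx \sum_{k=1}^{K} w_k(\mathbf{x}) \int_{\Omega} \rho_k(\omega_i, \omega_o)\, L_{\mathrm{env}}(\mathbf{x}, \omega_i)\, (\mathbf{n} \cdot \omega_i)\, \mathrm{d}\omega_i. $$

Solving such a system per grid point (e.g., by non-negative least squares over the HDR observations) yields spatially varying weights $w_k(\mathbf{x})$ that define the SVBRDF.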