Abstract

Among the many models for material appearance, data-driven representations like bidirectional texture functions (BTFs) play an important role, as they provide accurate real-time reproduction of complex light transport effects such as interreflections. However, their acquisition involves the time-consuming capture of many thousands of bidirectional samples in order to avoid interpolation artifacts. Furthermore, high dynamic range imaging with many long exposure steps is necessary in the presence of low albedo or self-shadowing. So far, these problems have been dealt with separately, by means of sparse reconstruction and multiplexed illumination techniques, respectively. Existing methods rely on data-driven models learned from data that has been range-reduced in a way that precludes their simultaneous application. In this paper, we address both problems at once through a novel method for learning data-driven appearance models, based on moving the dynamic range reduction from the data to the metric. Specifically, we learn models by minimizing the relative L2 error on the original data instead of the absolute L2 error on range-reduced data. We demonstrate that the models thus obtained allow for faithful reconstruction of material appearance from sparse and illumination-multiplexed measurements, greatly reducing both the number of images and the shutter times required. As a result, acquisition times drop from the order of hours to the order of minutes.
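As a rough illustration of the contrast described above (a sketch in our own notation, not taken from the paper), consider a parametric appearance model $f_\theta$, original HDR measurements $y_i$ at sample configurations $x_i$, a range-reduction map $r(\cdot)$ such as a logarithmic mapping, and a small stabilizer $\varepsilon$:

\[
\mathcal{L}_{\text{abs}}(\theta) \;=\; \sum_i \bigl( f_\theta(x_i) - r(y_i) \bigr)^2
\qquad\text{vs.}\qquad
\mathcal{L}_{\text{rel}}(\theta) \;=\; \sum_i \frac{\bigl( f_\theta(x_i) - y_i \bigr)^2}{y_i^2 + \varepsilon},
\]

where the left objective fits range-reduced data under an absolute L2 error, while the right objective fits the original data under a relative L2 error, weighting deviations in proportion to the measured radiance.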
