Abstract

We present a method that estimates the physically accurate reflectance of a material from a single image and reproduces real-world materials for use in well-known graphics engines and tools. Recovering the BRDF (bidirectional reflectance distribution function) from a single image is an ill-posed problem due to insufficient irradiance and geometry information, as well as insufficient sampling of the BRDF parameters. The problem can be alleviated with a simplified representation of surface reflectance, such as the Phong reflection model, and recent works have shown that convolutional neural networks can successfully predict the parameters of such empirical BRDF models for non-Lambertian surfaces. However, the parameters of physically-based models span a non-orthogonal space, which makes it difficult to estimate physically meaningful results. In this paper, we propose a method to estimate the parameters of a physically-based BRDF model from a single image. We focus on the metallic property of the physically-based model to enhance estimation accuracy: since metals and nonmetals have very different reflectance characteristics, our method processes them separately. Our method also generates auxiliary maps using a cGAN (conditional generative adversarial network) architecture to help estimate more accurate BRDF parameters. Based on our experimental results, the auxiliary map is chosen as an irradiance environment map for metallic materials and a specular map for nonmetallic ones. These auxiliary maps help disentangle the contributions of different factors, including light color, material color, and the specular and diffuse components, to the observed surface color. Our method first estimates whether the material in the input image is metallic or nonmetallic, and then estimates the BRDF parameters with a CNN (convolutional neural network) architecture guided by the generated auxiliary maps.
Our results show that our method effectively estimates BRDF parameters on both synthesized and real images.

Highlights

  • Extracting material properties has been a classic computer vision and graphics problem

  • We propose a method based on generative network guided convolutional neural network to automatically estimate the physically-based Bidirectional Reflectance Distribution Function (BRDF) parameters of an isotropic single material in a spherical shape

  • We have shown that a single material image of known shape is applicable for simultaneously generating an irradiance environment map and estimating material properties


Summary

INTRODUCTION

Extracting material properties has been a classic computer vision and graphics problem. [3] proposed a real-time approach that estimates surface reflectance from sequential RGB-D images in 90 ms. These prior works assume empirical BRDF models, such as the Phong and Blinn-Phong models, which have limited ability to represent the complex appearance of realistic materials. Empirical BRDF models mostly have straightforward, intuitive parameters, so 3D graphics artists can define materials by adjusting them with little difficulty. However, their limited expressiveness does not fully capture the physical reflection on real material surfaces, especially at grazing angles. (1) We propose a deep learning method to estimate the BRDF parameters of a physically-based shading model from a single image. Using this method, we can synthesize realistic images with physically-based materials captured from real-world objects. Because our method exploits the two-way process of a GAN [16], both illumination estimation and parameter estimation can be performed simultaneously in a single network.
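The overall pipeline described above can be sketched as a simple dispatch: classify the material as metallic or nonmetallic, generate the matching auxiliary map, then estimate the BRDF parameters guided by that map. The sketch below is purely illustrative; the function names and the threshold heuristic are placeholders standing in for the paper's trained CNN and cGAN components, not its actual API.

```python
# Hypothetical sketch of the two-branch estimation pipeline. The names
# classify_metallic, generate_auxiliary_map, and estimate_brdf are
# illustrative placeholders, and the saturation heuristic stands in for
# the paper's CNN classifier.

def classify_metallic(image):
    # Stage 1: binary metallic/nonmetallic decision.
    # Placeholder heuristic in place of a trained classifier.
    return image["mean_saturation"] < 0.2

def generate_auxiliary_map(image, is_metallic):
    # Stage 2: a cGAN generates the auxiliary map; per the paper's
    # experiments, an irradiance environment map suits metals and a
    # specular map suits nonmetals.
    return "irradiance_environment_map" if is_metallic else "specular_map"

def estimate_brdf(image):
    # Stage 3: a CNN, guided by the auxiliary map, regresses the
    # physically-based BRDF parameters (represented here as a dict).
    is_metallic = classify_metallic(image)
    aux_map = generate_auxiliary_map(image, is_metallic)
    return {"metallic": is_metallic, "auxiliary_map": aux_map}
```

In the real system each stage is a learned network; the point of the sketch is only the branching structure, where the metallic decision selects which auxiliary map conditions the parameter estimator.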

PHYSICALLY-BASED RENDERING PARAMETERS
ARTIST-FRIENDLY BRDF MODEL
BRDF ESTIMATION
PARAMETER ESTIMATOR
MAP GENERATOR
EVALUATION OF ENVIRONMENT AND SPECULAR MAP GENERATION
EVALUATION ON SYNTHESIZED DATASET
CONCLUSION
