Abstract

Many applications, such as computer-aided design and game rendering, need to reproduce realistic material appearance under complex lighting environments and varying viewing conditions. The realism of a three-dimensional object or scene depends heavily on the representation and rendering of textures, and the Bidirectional Texture Function (BTF) is one of the most widely used texture models. In this paper, we propose a neural network that learns a representation of BTF data in order to predict new texture images under novel lighting and viewing conditions. The proposed method was tested on a public BTF dataset and shown to produce satisfactory synthetic results.
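A BTF can be viewed as a function mapping a texture coordinate together with a light direction and a view direction to an RGB color, which is what makes it amenable to approximation by a neural network. The sketch below illustrates that framing with a tiny MLP forward pass in NumPy; the architecture, layer sizes, and random weights are illustrative assumptions, not the network described in the paper.

```python
import numpy as np

# Hypothetical sketch: treat the BTF as RGB = f(u, v, light_dir, view_dir)
# and approximate f with a small MLP. Weights here are random; a real model
# would be trained on measured BTF samples.

rng = np.random.default_rng(0)

def init_mlp(sizes):
    # sizes e.g. [8, 64, 64, 3]: input is (u, v) plus two 3-D direction vectors
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.maximum(x, 0.0)        # ReLU on hidden layers
    return 1.0 / (1.0 + np.exp(-x))       # sigmoid keeps RGB in [0, 1]

params = init_mlp([8, 64, 64, 3])
uv = np.array([0.25, 0.75])                 # texture coordinate
light = np.array([0.0, 0.0, 1.0])           # light direction (unit vector)
view = np.array([0.5, 0.0, np.sqrt(0.75)])  # view direction (unit vector)
rgb = forward(params, np.concatenate([uv, light, view]))
print(rgb.shape)
```

Querying such a model at light and view directions not present in the measured data is what "predicting new texture images under novel conditions" amounts to in this framing.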
