Abstract

This article presents object-oriented neural light transfer (NeLT), a novel neural representation of the dynamic light transport between an object and its environment. Our method disentangles the global illumination of a scene into individual objects' light transport, each represented by a neural network, and then composes them explicitly. It therefore enables flexible rendering with dynamic lighting, cameras, materials, and objects. Our rendering reproduces various important global illumination effects, such as diffuse illumination, glossy illumination, dynamic shadowing, and indirect illumination, complementing the capabilities of existing neural object representations. Experiments show that NeLT does not require path tracing or shading results as input, yet achieves rendering quality comparable to state-of-the-art rendering frameworks, including recent deep-learning-based denoisers.
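
To make the compositional idea concrete, the sketch below shows one way a per-object neural light-transport module could be queried and then explicitly composed into a scene-level result. The network sizes, the 9-dimensional query (position, view direction, light direction), and the additive composition rule are illustrative assumptions for this sketch only, not the paper's actual architecture.

```python
# Minimal PyTorch sketch of per-object neural light transport with explicit composition.
# All design choices here (input encoding, MLP size, additive compositing) are assumptions.
import torch
import torch.nn as nn


class ObjectLightTransport(nn.Module):
    """Hypothetical per-object network: maps a surface query
    (position, view direction, light direction) to an RGB contribution."""

    def __init__(self, in_dim: int = 9, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.mlp(x)


def compose_scene(objects, query: torch.Tensor) -> torch.Tensor:
    """Explicit composition: sum each object's predicted contribution at the query.
    (A simple additive rule, used purely for illustration.)"""
    return torch.stack([obj(query) for obj in objects], dim=0).sum(dim=0)


if __name__ == "__main__":
    # Two objects, one batch of 4 surface queries (position + view dir + light dir).
    scene = [ObjectLightTransport(), ObjectLightTransport()]
    q = torch.randn(4, 9)
    rgb = compose_scene(scene, q)
    print(rgb.shape)  # torch.Size([4, 3])
```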
