Abstract

Importance sampling techniques significantly reduce variance in physically-based rendering. In this paper we propose a novel online framework to learn the spatially varying distribution of the full product of the rendering equation, using a single small neural network trained on stochastic ray samples. The learned distributions can be used to efficiently sample the full product of incident light. To accomplish this, we introduce a novel closed-form density model, called the Normalized Anisotropic Spherical Gaussian mixture, which can model a complex light field with a small number of parameters and can be directly sampled. Our framework progressively renders and learns the distribution, without requiring any warm-up phase. With the compact and expressive representation of our density model, our framework can be implemented entirely on the GPU, allowing it to produce high-quality images with limited computational resources. The results show that our framework outperforms existing neural path guiding approaches and achieves comparable or even better performance than state-of-the-art online statistical path guiding techniques.
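
The abstract does not give the closed-form expression of the Normalized Anisotropic Spherical Gaussian, so the sketch below is only an illustrative stand-in for the general workflow it describes: evaluating and directly sampling a normalized mixture of spherical lobes whose parameters would, in the paper's setting, be predicted per shading point by the small neural network. It uses isotropic von Mises-Fisher lobes (whose normalization and inverse-CDF sampling are standard and exact) rather than the paper's anisotropic model, and the helper names (`vmf_sample`, `mixture_sample`, etc.) are hypothetical.

```python
# Illustrative sketch only: stands in for the paper's NASG mixture with a
# mixture of isotropic von Mises-Fisher lobes on the unit sphere.
import numpy as np

def _onb(n):
    """Build an orthonormal basis (t, b) around a unit vector n (branchless ONB)."""
    s = np.copysign(1.0, n[2])
    a = -1.0 / (s + n[2])
    b = n[0] * n[1] * a
    t = np.array([1.0 + s * n[0] * n[0] * a, s * b, -s * n[0]])
    bt = np.array([b, s + n[1] * n[1] * a, -n[1]])
    return t, bt

def vmf_pdf(v, mu, kappa):
    """Normalized vMF density on the sphere, in a numerically stable form."""
    return kappa / (2.0 * np.pi * (1.0 - np.exp(-2.0 * kappa))) \
        * np.exp(kappa * (np.dot(v, mu) - 1.0))

def vmf_sample(mu, kappa, rng):
    """Exact inverse-CDF sample of one lobe centered on mean direction mu."""
    u1, u2 = rng.random(), rng.random()
    # Cosine of the angle to mu, from the closed-form inverse CDF of exp(kappa*w).
    w = 1.0 + np.log(u1 + (1.0 - u1) * np.exp(-2.0 * kappa)) / kappa
    phi = 2.0 * np.pi * u2
    t, b = _onb(mu)
    r = np.sqrt(max(0.0, 1.0 - w * w))
    return w * mu + r * (np.cos(phi) * t + np.sin(phi) * b)

def mixture_pdf(v, weights, mus, kappas):
    """Density of the full mixture, needed for the importance-sampling weight."""
    return sum(w * vmf_pdf(v, m, k) for w, m, k in zip(weights, mus, kappas))

def mixture_sample(weights, mus, kappas, rng):
    """Pick a lobe proportionally to its weight, then sample that lobe directly."""
    i = rng.choice(len(weights), p=weights)
    v = vmf_sample(mus[i], kappas[i], rng)
    return v, mixture_pdf(v, weights, mus, kappas)

# Usage: a two-lobe mixture with hand-picked parameters; a guided path tracer
# would divide the integrand's contribution by the returned pdf.
rng = np.random.default_rng(0)
weights = np.array([0.7, 0.3])
mus = [np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0])]
kappas = [50.0, 10.0]
direction, pdf = mixture_sample(weights, mus, kappas, rng)
```

Because every lobe has a closed-form normalization and an exact sampling routine, the mixture density integrates to one and can be sampled without rejection or tabulation, which is the property the abstract attributes to the NASG mixture.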
