Abstract

It has been shown that rendering in the gradient domain, i.e., estimating finite-difference gradients of image intensity using correlated samples and combining them with direct estimates of pixel intensities by solving a screened Poisson problem, often offers fundamental benefits over merely sampling pixel intensities. The reasons can be traced to the frequency content of the light transport integrand and its interplay with the gradient operator. However, while they often yield state-of-the-art performance among algorithms based on Monte Carlo sampling alone, gradient-domain rendering algorithms have, until now, not generally been competitive with techniques that combine Monte Carlo sampling with post-hoc noise removal using sophisticated non-linear filtering. Drawing on the power of modern convolutional neural networks, we propose a novel reconstruction method for gradient-domain rendering. Our technique replaces the screened Poisson solver of previous gradient-domain techniques with a novel dense variant of the U-Net autoencoder that additionally takes auxiliary feature buffers as inputs. We optimize our network to minimize a perceptual image distance metric calibrated to the human visual system. Our results significantly improve the quality obtained from gradient-domain path tracing, allowing it to overtake state-of-the-art comparison techniques that denoise traditional Monte Carlo samples. In particular, we observe that the correlated gradient samples, which offer information about the smoothness of the integrand that is unavailable in standard Monte Carlo sampling, notably improve image quality compared to an equally powerful neural model that does not make use of gradient samples.
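For context, the classical L2 screened Poisson reconstruction that our network replaces can be sketched as below. This is a minimal single-channel NumPy solver under assumed conventions (forward-difference gradients, periodic boundaries, a data-term weight alpha); it is illustrative only and not the implementation used in the paper.

```python
import numpy as np

def screened_poisson_l2(primal, gx, gy, alpha=0.2):
    """Closed-form L2 screened Poisson reconstruction (one colour channel).

    primal : noisy pixel-intensity estimate P (H x W)
    gx, gy : finite-difference gradient estimates along x and y (H x W)
    alpha  : weight of the data term ||I - P||^2 against ||grad I - G||^2

    Solves  argmin_I  alpha^2 * ||I - P||^2 + ||grad I - G||^2
    in closed form via the FFT, assuming forward differences and
    periodic boundary conditions.
    """
    h, w = primal.shape

    # Divergence of the sampled gradient field with backward differences
    # (the adjoint of the forward-difference gradient under periodic BCs).
    div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))

    # Eigenvalues of the discrete Laplacian under periodic boundaries.
    fx = np.fft.fftfreq(w)[None, :]
    fy = np.fft.fftfreq(h)[:, None]
    lap_eig = (2.0 * np.cos(2.0 * np.pi * fx) - 2.0) + \
              (2.0 * np.cos(2.0 * np.pi * fy) - 2.0)

    # Normal equations in the Fourier domain: (alpha^2 - Lap) I = alpha^2 P - div G.
    numerator = alpha**2 * np.fft.fft2(primal) - np.fft.fft2(div)
    denominator = alpha**2 - lap_eig
    return np.real(np.fft.ifft2(numerator / denominator))
```

For an RGB image the solve is applied independently per channel; our method replaces this linear reconstruction with a learned, non-linear one that can also consume auxiliary feature buffers.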

Highlights

  • Realistic image synthesis seeks to produce realistic virtual photographs by computationally solving the Rendering Equation [Kajiya 1986], often by randomly sampling paths that carry light from the light sources to the sensor.

  • We seek to combine the power of the natural smoothness information provided by gradient samples with the high performance of modern denoising techniques that draw on auxiliary features, and aim to do this in a way that adapts to the myriad of different light transport configurations.

  • To incorporate auxiliary feature information, e.g., depth, normal, and albedo, without hand-designing smoothness priors to weight the terms in the equations, and to support optimizing the result in terms of complex, non-linear perceptual image distance metrics, we take a different route and cast the reconstruction into a direct regression problem solved by a convolutional neural network (CNN); a minimal sketch of this regression setup follows this list.
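The sketch below shows such a direct regression setup, assuming a plain fully convolutional network in PyTorch rather than the dense U-Net variant described in the paper; the buffer layout, channel counts, and the L1 training loss are illustrative placeholders (the paper optimizes a perceptual metric).

```python
import torch
import torch.nn as nn

class GradientDomainRegressor(nn.Module):
    """Illustrative CNN (not the paper's architecture) that regresses a clean
    image directly from per-pixel rendering buffers.

    Assumed input channels: noisy radiance (3), dx/dy gradients (6),
    albedo (3), normal (3), depth (1) -> 16 channels total.
    """
    def __init__(self, in_channels=16, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, 3, 3, padding=1),
        )

    def forward(self, buffers):
        # buffers: (batch, 16, H, W) concatenation of all input buffers
        return self.net(buffers)

# Training-step sketch: regress toward a high-sample-count reference image.
model = GradientDomainRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # stand-in loss; the paper uses a perceptual metric

buffers = torch.randn(4, 16, 128, 128)   # noisy input buffers
reference = torch.randn(4, 3, 128, 128)  # converged reference images

optimizer.zero_grad()
prediction = model(buffers)
loss = loss_fn(prediction, reference)
loss.backward()
optimizer.step()
```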


Summary

INTRODUCTION

Realistic image synthesis seeks to produce realistic virtual photographs by computationally solving the Rendering Equation [Kajiya 1986], often by randomly sampling paths that carry light from the light sources to the sensor. Gradient-domain rendering additionally samples finite-difference gradients between neighboring pixels using correlated paths and reconstructs the final image by solving a screened Poisson problem. Depending on the choice of norm in which the screened Poisson problem is solved, the techniques remain unbiased (L2) or consistent (L1), while still exploiting smoothness for better reconstruction, but without relying on heuristics based on auxiliary helper variables. We seek to combine the power of the natural smoothness information provided by gradient samples with the high performance of modern denoising techniques that draw on auxiliary features, and aim to do this in a way that adapts to the myriad of different light transport configurations. Our reconstruction significantly improves the quality of reconstructed shadows despite working with a much lower sample count in equal-time comparisons (Figure 1).
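The norm choice mentioned above refers to the reconstruction objective of gradient-domain rendering, written here in a common form (the exact weighting by α and the normalization vary between formulations):

```latex
% Screened Poisson reconstruction of the image I from the primal estimate P
% and the sampled finite-difference gradients G; p = 2 yields the L2
% reconstruction, p = 1 the L1 variant.
I^{*} \;=\; \arg\min_{I} \; \alpha \,\lVert I - P \rVert_{p}^{p}
\;+\; \lVert \nabla I - G \rVert_{p}^{p},
\qquad p \in \{1, 2\}.
```

Because the L2 solution is a linear function of the unbiased primal and gradient estimates, it is itself unbiased; the L1 solution is non-linear but still converges to the correct image, i.e., it is consistent.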

RELATED WORK
  Denoising
  Gradient-Domain Rendering
  Neural Networks
  Perceptual Losses
NEURAL RECONSTRUCTION
  Network Architecture
  The Loss Function and Dynamic Range
  Network Details
RESULTS AND DISCUSSION
  Test Scenes
  Metrics
  Overall Performance and Analysis
  Ablations and Directed Tests
  Training Set and Generalization
  Implementation Details
CONCLUSIONS