Abstract

We present NeX, a new approach to novel view synthesis based on enhancements of multiplane images (MPI) that can reproduce view-dependent effects in real time. Unlike a traditional MPI, our technique models view-dependent effects by parameterizing each pixel as a linear combination of spherical basis functions learned by a neural network, and it uses a hybrid implicit-explicit modeling strategy to improve fine detail. Moreover, we present an extension to NeX that leverages knowledge distillation to train multiple MPIs for unbounded 360° scenes. We evaluate our method on several benchmarks: the NeRF-Synthetic, Light Field, Real Forward-Facing, and Space datasets, as well as Shiny, our new dataset containing significantly more challenging view-dependent effects, such as rainbow reflections on a CD. Our method outperforms other real-time rendering approaches in PSNR, SSIM, and LPIPS and can render unbounded 360° scenes in real time.
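
To make the pixel parameterization concrete, below is a minimal NumPy sketch of a per-pixel color model of the form C(v) = k0 + Σₙ kₙ·Hₙ(v), where k0 is an explicitly stored base color, kₙ are per-pixel coefficients, and Hₙ(v) are view-dependent basis functions. The array shapes, the toy sinusoidal basis, and all function names here are illustrative assumptions; in NeX itself the basis functions are produced by a learned neural network rather than fixed formulas.

import numpy as np

# Hypothetical sizes: an H x W pixel grid, N basis functions, RGB colors.
H, W, N = 4, 4, 8

# Per-pixel coefficients: k0 is the view-independent base color (stored
# explicitly in the hybrid scheme); k1..kN weight the view-dependent basis.
k0 = np.random.rand(H, W, 3)          # base RGB color per pixel
kn = np.random.rand(H, W, N, 3)       # N reflectance coefficients per pixel

def basis_functions(view_dir, n=N):
    """Stand-in for the learned basis H_n(v); a fixed sinusoidal basis
    is used here purely for illustration."""
    freqs = np.arange(1, n + 1)
    return np.sin(freqs * (view_dir @ np.ones(3)))  # shape (N,)

def pixel_color(view_dir):
    """Evaluate C(v) = k0 + sum_n kn * H_n(v) for every pixel at once."""
    h = basis_functions(view_dir)                   # (N,)
    return k0 + np.einsum('hwnc,n->hwc', kn, h)     # (H, W, 3)

colors = pixel_color(np.array([0.0, 0.0, 1.0]))    # a frontal viewing direction
print(colors.shape)  # (4, 4, 3)

Because the coefficients kₙ are fixed per pixel and only the cheap basis evaluation depends on the viewing direction, a new view requires only a small dot product per pixel, which is what makes real-time rendering of view-dependent effects feasible.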