Abstract

Although Augmented Reality applications are becoming increasingly popular, the lack of visual realism in rendering remains an open problem due to the computational cost of realistic rendering. Physically-based algorithms can generate renderings with a high degree of photorealism, and they have become increasingly practical with the recent development of hardware accelerators. This work shows how to integrate Augmented Reality frameworks with ray tracing frameworks to create scenes with high-quality, real-time reflections and refractions, with emphasis on blending virtual objects into the real environment. To support the interaction between real and virtual elements, a cube textured with images of the real environment must be provided. The proposed middleware adds no processing overhead compared to using the ray tracing framework alone. We show that, with our approach, photorealistic augmented reality rendering can be achieved in real time without any special equipment.
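
As an illustration of how an environment-textured cube can supply background radiance for reflected and refracted rays, the sketch below maps a ray direction to one of six face images captured from the real environment and returns the corresponding color; rays that miss all virtual geometry could be shaded this way so they blend into the real scene. This is a minimal sketch under assumed conventions: the names (EnvCube, FaceImage, sample), the face ordering, and the image format are illustrative, not the paper's actual API.

```cpp
// Hypothetical environment-cube lookup for AR ray tracing (not the paper's API).
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

struct Color { float r, g, b; };

struct FaceImage {                 // one face of the cube, captured from the real camera feed
    int width = 0, height = 0;
    std::vector<Color> pixels;     // row-major RGB
    Color at(float u, float v) const {
        int x = std::clamp(int(u * width),  0, width  - 1);
        int y = std::clamp(int(v * height), 0, height - 1);
        return pixels[y * width + x];
    }
};

struct EnvCube {
    // Assumed face order: +X, -X, +Y, -Y, +Z, -Z (common cubemap convention).
    std::array<FaceImage, 6> faces;

    // Map a nonzero world-space direction to a face and (u, v) in [0, 1],
    // then fetch the real-environment color for that direction.
    Color sample(float dx, float dy, float dz) const {
        float ax = std::fabs(dx), ay = std::fabs(dy), az = std::fabs(dz);
        int face; float u, v, m;
        if (ax >= ay && ax >= az) {            // X-dominant direction
            m = ax; face = dx > 0 ? 0 : 1;
            u = dx > 0 ? -dz : dz;  v = -dy;
        } else if (ay >= az) {                 // Y-dominant direction
            m = ay; face = dy > 0 ? 2 : 3;
            u = dx;  v = dy > 0 ? dz : -dz;
        } else {                               // Z-dominant direction
            m = az; face = dz > 0 ? 4 : 5;
            u = dz > 0 ? dx : -dx;  v = -dy;
        }
        // Remap from [-1, 1] to [0, 1] texture coordinates.
        u = 0.5f * (u / m + 1.0f);
        v = 0.5f * (v / m + 1.0f);
        return faces[face].at(u, v);
    }
};
```

In a typical setup, such a lookup would be invoked from the ray tracer's miss shading path, so secondary rays leaving the virtual geometry pick up colors from the captured real environment rather than a synthetic background.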
