Abstract

Mobile devices are gradually changing people's computing behaviors. However, due to limitations in physical size and power consumption, they cannot deliver a 3D graphics rendering experience comparable to desktops, and many applications with intensive graphics rendering workloads cannot run on mobile platforms directly. This issue can be addressed with remote rendering: the heavy 3D graphics rendering computation runs on a powerful server, and the rendering results are transmitted to the mobile client for display. However, a simple remote rendering solution inevitably suffers from the large interaction latency introduced by wireless networks, which is unacceptable for applications with strict latency requirements. In this article, we present a low-latency remote rendering system that helps mobile devices render interactive 3D graphics in real time. Our design takes advantage of an image-based rendering technique, 3D image warping, to synthesize the mobile display from depth images generated on the server. Our research shows that the system can successfully reduce interaction latency while maintaining high rendering quality by generating multiple depth images at carefully selected viewpoints. We study the viewpoint selection problem, propose a real-time reference viewpoint prediction algorithm, and evaluate its performance with experiments on real devices.
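
To make the core idea concrete, the sketch below illustrates the kind of 3D image warping operation the abstract refers to: a color-plus-depth image rendered at a reference viewpoint on the server is reprojected to a nearby target viewpoint on the client. This is a minimal illustrative sketch under stated assumptions, not the authors' implementation; the function name warp_depth_image, the shared intrinsics K, and the relative pose (R, t) are hypothetical names introduced only for illustration.

```python
# Illustrative sketch of forward 3D image warping (not the paper's implementation).
# A reference color+depth image is back-projected to 3D and re-projected into a
# target camera; a z-buffer keeps the nearest surface per target pixel.
import numpy as np

def warp_depth_image(color, depth, K, R, t):
    """Reproject a reference color+depth image to a target viewpoint.

    color: (H, W, 3) reference image
    depth: (H, W) per-pixel depth in the reference camera frame
    K:     (3, 3) camera intrinsics (assumed shared by both viewpoints)
    R, t:  rotation (3, 3) and translation (3,) from reference to target camera
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    # Back-project reference pixels to 3D points in the reference camera frame.
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # (3, H*W)
    pts = np.linalg.inv(K) @ pix * depth.reshape(1, -1)
    # Transform the points into the target camera frame and project them.
    pts_t = R @ pts + t.reshape(3, 1)
    proj = K @ pts_t
    z = proj[2]
    x = np.round(proj[0] / z).astype(int)
    y = np.round(proj[1] / z).astype(int)
    # Scatter visible points into the target image (nearest-pixel, z-buffered).
    out = np.zeros_like(color)
    zbuf = np.full((H, W), np.inf)
    valid = (z > 0) & (x >= 0) & (x < W) & (y >= 0) & (y < H)
    src = color.reshape(-1, 3)
    for i in np.flatnonzero(valid):
        if z[i] < zbuf[y[i], x[i]]:
            zbuf[y[i], x[i]] = z[i]
            out[y[i], x[i]] = src[i]
    return out
```

Pixels that become disoccluded at the target viewpoint have no source sample and remain holes in the warped output, which is why synthesizing the display from multiple depth images at well-chosen reference viewpoints matters for rendering quality.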
