Building a system to interactively visualize extremely large data sets on large tiled displays in a real-time immersive environment involves a number of challenges. First, the system must scale to support the rendering of large data sets. Second, it must provide fast, constant frame rates regardless of user viewpoint or model orientation. Third, it must deliver the highest-resolution imagery where it is needed. Fourth, it must offer a flexible user interface for controlling interaction with the display. This paper presents a prototype system that meets all four of these criteria. It details the design of a wireless user interface used in conjunction with two multiresolution techniques, foveated vision and progressive image composition, to generate images on a tiled display wall. The system exploits the parallel, multidisplay, and multiresolution features of the Metabuffer image composition hardware architecture to produce interactive renderings of large data streams at fast, constant frame rates.