What if we could visualize and interact with information directly in the context of our surroundings? Our research group is exploring how augmented reality (AR) could someday make this possible. AR integrates a complementary virtual world with the physical world; for example, head-tracked, see-through, head-worn displays can overlay graphics on what we see. Instead of looking back and forth between the real world and a PDA, we look directly at the real world and the virtual information overlaid on it.

At the heart of this approach is context-aware computing: computing systems that are sensitive to the context in which they operate, ranging from human relationships to physical location. For example, information might be tied to specific locations within a global, Earth-centered coordinate system.

How can we design effective mobile AR user interfaces? We've been trying to answer this question in part by developing experimental AR research prototypes. In AR, as in information visualization on desktop systems, the amount of information available can far exceed what a system can legibly display at any one time, necessitating information filtering. Julier et al. (2000) developed information filtering techniques for AR that depend on the user's goals, an object's importance, and its proximity to the user. We assume that a system can accomplish information filtering of this sort and that our system is displaying everything it should.
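To make the filtering idea concrete, here is a minimal sketch of importance- and proximity-based filtering. The class, function names, and the scoring formula are our own illustrative assumptions for this example; they are not the model published by Julier et al. (2000).

```python
from dataclasses import dataclass

@dataclass
class ARObject:
    """A hypothetical annotated object in the user's surroundings."""
    name: str
    importance: float   # assumed per-object weight in [0, 1]
    distance_m: float   # distance from the user, in meters

def focus_score(obj: ARObject, goal_relevance: float = 1.0) -> float:
    # Illustrative scoring: weight importance by a simple distance falloff,
    # scaled by how relevant the object is to the user's current goal.
    return goal_relevance * obj.importance / (1.0 + obj.distance_m)

def filter_objects(objects: list[ARObject], threshold: float = 0.05) -> list[ARObject]:
    # Display only objects whose score clears the threshold, so the
    # view stays legible even when many annotations are available.
    return [o for o in objects if focus_score(o) >= threshold]
```

For instance, a highly important nearby object (importance 0.9 at 2 m) scores 0.3 and is kept, while an unimportant distant one (importance 0.1 at 50 m) scores about 0.002 and is filtered out.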