Abstract

This paper presents a context-aware model for situated analytics that supports a blended user interface. Our approach is a state-based model that allows seamless transitions between the physical space and the information space during use. The model is designed so that common user interface controls work in tandem with the printed information on a physical object, adapting their operation and presentation based on a semantic matrix. We demonstrate the model with a set of blended controls, including pinch-to-zoom, menus, and details-on-demand. We analyze each control to highlight how the physical and virtual information spaces work together to provide a rich interaction environment in augmented reality.
