Abstract

We introduce a stand-alone, wearable system with full-body and finger tracking for first-person virtual reality (VR) avatars. The system does not rely on any external trackers or components. It comprises a head-mounted display, an inertial motion capture suit, VR gloves, and a VR backpack PC. Using this wearable system and the RUIS toolkit [1], we present an example implementation of our vision for physics-based full-body avatar interaction. This envisioned interaction involves three elements from the reality-based interaction framework of Jacob et al. [2]: naïve physics, body awareness, and environment awareness. These elements lend common-sense affordances within the virtual world and allow users to employ their everyday knowledge of the real world. We argue that when it comes to full-body avatar interfaces, it is not only users but also developers who benefit from utilizing physics simulation as the basis upon which different interaction techniques are built. This physics-based approach provides intuitive manipulation and locomotion interactions without requiring individually crafted scripts. Our example implementation presents several such interactions. Furthermore, the many interaction techniques emerging from physics simulation are congruous with each other, which promotes user interface consistency. We also introduce the idea of using physics components (colliders, joints, materials, etc.) as 3D user interface building blocks, as opposed to scripting or visual programming.
