Abstract

Virtual reality has become more accessible and affordable to the general public in recent years, introducing the exciting potential of this technology to new audiences. However, the mechanisms of navigating within a virtual environment have primarily been constrained to handheld input devices akin to gaming controllers. For people unfamiliar with traditional gaming input devices, VR navigation devices are not intuitively mapped to real-world modes of locomotion and can be frustrating and disorienting. Designers have largely focused on utility (the ability to efficiently accomplish a task) to the detriment of usability (ease of use). The industry lacks an intuitive, universal method of navigation that can be easily learned by novice participants. Dr. Jakob Nielsen identified five factors that impact usability in human-computer interactions (HCI): learnability, efficiency, memorability, errors, and satisfaction. Previous research in virtual environment locomotion incorporated teaching periods in which the researchers explained the control devices to participants. We believe that this neglected one of the key usability factors in human-computer interactions: learnability, or the ease with which a user can accomplish a task the first time they encounter it. Our research compares existing modes of navigation (game-controller based) with a mode of controller-less, fully embodied navigation across two demographics, evaluated against Nielsen's usability factors. Existing research has demonstrated little noticeable learnability difference between rotation- and lean-based navigation and joystick navigation in VR [1]; however, a similar study demonstrates that partially embodied leaning mechanics can positively affect sensory perception in VR [2].
While previous studies of controller-less VR navigation methods have demonstrated an inclination toward subject motion sickness when controllers are removed [3], other research has yielded positive qualitative results (sans motion sickness) when partially embodied alternative controller systems are used [4]. Additional research into partially embodied alternative controller systems has, in fact, indicated a preference among VR subjects for existing modes of controller-based joystick navigation [5]; however, when partially embodied leaning mechanics are combined with another mode of sensory perception, such as foot haptics, self-motion perception (vection) is enhanced [6]. In this study, we test a fully embodied mode of navigation to evaluate whether a fully engaged body produces more positive usability results according to HCI measures. We test our results within communities of self-identified gamers and non-gamers, evaluating navigation modes designed for joystick control pads, trigger-based teleportation, and controller-less embodied navigation. Our research asks whether embodied navigation enhances usability in accordance with Nielsen's usability factors, specifically whether it enables easier access and engagement for inexperienced subjects, compared with controller-based modes of navigation.
