Abstract
Virtual Reality (VR) technology lets users train for high-stakes situations in the safety of a virtual environment (VE). Yet user movement through such an environment can cause postural instability and motion sickness. These issues are often attributed to how the brain processes visual self-motion information in VEs. Low-contrast conditions, such as those caused by dense fog, are known to affect observers’ self-motion perception, but it is unclear how posture, motion sickness, and navigation performance are affected by this kind of visual degradation of the environment. Our ongoing work uses VR to address three aspects of this problem. First, we verify the effects of low contrast on visual speed estimates in VR. Second, we test how contrast reduction affects postural control, motion sickness, and performance during a VR navigation task. Third, we examine whether augmenting low-contrast conditions with high-contrast visual aids in the environment is useful.