Abstract

How vision guides gaze in realistic settings has been researched for decades. Human gaze behavior is typically measured in laboratory settings that are well controlled but feature-reduced and movement-constrained, in sharp contrast to real-life gaze control, which combines eye, head, and body movements. Previous real-world research has shown environmental factors such as terrain difficulty to affect gaze; however, real-world settings are difficult to control or replicate. Virtual reality (VR) offers the experimental control of a laboratory, yet approximates the freedom and visual complexity of the real world (RW). We measured gaze in 8 healthy young adults during walking in the RW and during simulated locomotion in VR. Participants walked along a pre-defined path inside an office building, which included different terrains such as long corridors and flights of stairs. In VR, participants followed the same path in a detailed virtual reconstruction of the building. We devised a novel hybrid control strategy for movement in VR: participants did not physically translate; instead, forward movements were controlled by a hand-held device, while rotational movements were executed physically and transferred to VR. We found significant effects of terrain type (flat corridor, staircase up, and staircase down) on gaze direction, on the spatial spread of gaze direction, and on the angular distribution of gaze-direction changes. The factor world (RW and VR) affected the angular distribution of gaze-direction changes, saccade frequency, and head-centered vertical gaze direction. The latter effect vanished when referencing gaze to a world-fixed coordinate system and was likely due to specifics of headset placement, which cannot confound any of the other analyzed measures. Importantly, we did not observe a significant interaction between the factors world and terrain for any of the tested measures. This indicates that differences between terrain types are not modulated by the world. The overall dwell time on navigational markers did not differ between worlds. The similar dependence of gaze behavior on terrain in the RW and in VR indicates that our VR captures real-world constraints remarkably well. High-fidelity VR combined with naturalistic movement control therefore has the potential to narrow the gap between the experimental control of a lab and ecologically valid settings.
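
To make the hybrid control strategy more concrete, the following minimal Python sketch illustrates one way such a scheme could be implemented: translation is driven only by a hand-held controller input, while rotation is taken from the physically executed head/body orientation. All names and parameters (`update_pose`, `MAX_SPEED`, `DT`, `trigger_value`) are hypothetical illustrations under these assumptions, not the implementation used in the study.

```python
import math

# Hypothetical constants; the study's actual parameters are not given here.
MAX_SPEED = 1.4      # forward speed in m/s at full trigger press (assumed)
DT = 1.0 / 90.0      # update interval for an assumed 90 Hz headset refresh

def update_pose(position, headset_yaw, trigger_value):
    """Advance the virtual camera by one frame.

    position      -- (x, z) location in the virtual building, in metres
    headset_yaw   -- physically executed head/body rotation (radians),
                     read directly from the headset's orientation sensor
    trigger_value -- hand-held controller input in [0, 1]; only this
                     input produces forward translation
    """
    speed = MAX_SPEED * max(0.0, min(1.0, trigger_value))
    dx = speed * DT * math.sin(headset_yaw)   # rotation is physical ...
    dz = speed * DT * math.cos(headset_yaw)   # ... translation is virtual
    return (position[0] + dx, position[1] + dz)

# Example frame update: half-pressed trigger while facing 30 deg to the right.
pos = update_pose((0.0, 0.0), math.radians(30), 0.5)
```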

Highlights

  • The question of what guides our gaze in realistic settings has interested researchers for decades

  • Itti et al. (1998) adapted Koch and Ullman's (1985) “saliency map” to predict fixated locations in natural scenes; many subsequent models followed the idea of combining image features using increasingly sophisticated schemes or optimality principles (e.g., Bruce and Tsotsos, 2006; Harel et al., 2006; Zhang et al., 2008; Garcia-Diaz et al., 2012). As such models presumably build in some implicit object representation, and objects are crucial for gaze guidance (Stoll et al., 2015), it comes as no surprise that models using deep neural networks that share their lower layers with object-recognition models (e.g., Kümmerer et al., 2015) have become the most successful, approaching the theoretical image-computable optimum in predicting gaze during free viewing of natural scenes

  • Mean saccade rates were computed for each participant: 2.03 ± 0.35 s⁻¹ in virtual reality (VR) (corridors: 1.64 ± 0.29 s⁻¹, ascending stairs: 2.28 ± 0.49 s⁻¹, descending stairs: 2.21 ± 0.70 s⁻¹) and 3.46 ± 0.18 s⁻¹ in the real world (RW); a sketch of how such per-participant rates can be computed appears below
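
The per-participant saccade rates above are counts of detected saccades normalized by the time spent in each condition. The following Python sketch shows one way such rates could be computed; the table layout, the toy numbers, and the column names (`saccades`, `durations`, `rate_per_s`) are assumptions for illustration and do not reflect the study's actual data format or saccade-detection procedure.

```python
import pandas as pd

# Assumed layout: one row per detected saccade, labeled with the world and
# terrain it occurred in, plus a table of time spent per condition (toy values).
saccades = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2],
    "world":       ["VR", "VR", "VR", "VR", "VR"],
    "terrain":     ["corridor", "corridor", "stairs_up", "corridor", "stairs_down"],
})
durations = pd.DataFrame({
    "participant": [1, 1, 2, 2],
    "world":       ["VR", "VR", "VR", "VR"],
    "terrain":     ["corridor", "stairs_up", "corridor", "stairs_down"],
    "seconds":     [120.0, 45.0, 110.0, 40.0],
})

# Saccade rate = number of saccades / time spent in that condition (s^-1).
counts = (saccades.groupby(["participant", "world", "terrain"])
                  .size().rename("n_saccades").reset_index())
rates = counts.merge(durations, on=["participant", "world", "terrain"])
rates["rate_per_s"] = rates["n_saccades"] / rates["seconds"]

# Mean and SD across participants, per world and terrain.
summary = rates.groupby(["world", "terrain"])["rate_per_s"].agg(["mean", "std"])
print(summary)
```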


Summary

INTRODUCTION

The question of what guides our gaze in realistic settings has interested researchers for decades. At the same time, when aiming for general results beyond a specific application setting – such as sports (e.g., Land and McLeod, 2000; Hayhoe et al., 2012; for a review see Kredel et al., 2017), interface design (Thoma and Dodd, 2019), customer evaluation (Zhang et al., 2018), or driving (Land, 1992; Chapman and Underwood, 1998; Kapitaniak et al., 2015), to name just a few areas where eye tracking has become a widely used tool – the degree of experimental control in a real-world setting is severely limited. This may become even more crucial when specific tasks such as search are to be studied, rather than free exploration or free viewing. Under the hypothesis that VR faithfully approximates the RW, we expect main effects of the factor sector, but no interaction between sector and world, for dependent variables characterizing relevant aspects of gaze allocation.
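
This hypothesis maps onto a standard two-factor, within-subject design. As a sketch only (not the paper's actual analysis pipeline), the Python example below fits a two-way repeated-measures ANOVA with factors world and sector to a synthetic gaze measure and reports the main effects and the world × sector interaction; the synthetic data and the use of `statsmodels.stats.anova.AnovaRM` are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic long-format data: one row per participant x world x sector with an
# aggregated gaze measure (e.g., mean vertical gaze direction in degrees).
rng = np.random.default_rng(0)
participants = range(1, 9)                     # 8 participants, as in the study
worlds = ["RW", "VR"]
sectors = ["corridor", "stairs_up", "stairs_down"]
rows = [{"participant": p, "world": w, "sector": s,
         "gaze_measure": rng.normal(loc={"corridor": 0.0, "stairs_up": 5.0,
                                         "stairs_down": -10.0}[s], scale=2.0)}
        for p in participants for w in worlds for s in sectors]
data = pd.DataFrame(rows)

# Two-way repeated-measures ANOVA: main effects of world and sector, plus the
# world x sector interaction. Under the stated hypothesis, one would expect a
# significant sector effect but a non-significant world x sector interaction.
res = AnovaRM(data, depvar="gaze_measure", subject="participant",
              within=["world", "sector"]).fit()
print(res)
```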

MATERIALS AND METHODS
Experimental Setup and Gaze Recording
Procedure and Participants
RESULTS
DISCUSSION
ETHICS STATEMENT