Abstract

We investigated gaze direction determination in dyadic interactions mediated by an Augmented Reality (AR) head-mounted display. With AR, virtual content is overlaid on the real-world scene, offering unique data visualization and interaction opportunities. A drawback of AR, however, is the uncertainty it introduces about the AR user's focus of attention in social-collaborative settings: an AR user looking in our direction might be paying attention either to us or to augmentations positioned somewhere in between. In two psychophysical experiments, we assessed how assumptions concerning the positioning of virtual content attended by an AR user affect other people's sensitivity to that user's gaze direction. In the first experiment, we found that gaze discrimination was better when the participant was aware that the AR user was focusing on stimuli positioned on the participant's own depth plane, as opposed to stimuli positioned halfway between the AR user and the participant. In the second experiment, we found that this modulatory effect was explained by participants' assumptions concerning which plane the AR user was focusing on, irrespective of whether those assumptions were correct. We discuss the significance of reduced gaze determination under AR in social-collaborative settings, as well as the theoretical implications regarding the impact of this technology on social behaviour.

Highlights

  • Gaze behaviours carry important nonverbal information that informs and regulates interactions between individuals [1,2,3].

  • We observed that discrimination performance improved only when participants believed that stimuli were displayed on the Far plane, irrespective of whether this assumption was correct, demonstrating that a subjective expectation regarding the positioning of virtual content attended by the Actor modulated gaze direction sensitivity.

  • Differences in SD values revealed that participants were more sensitive to gaze direction information when the Actor fixated on stimuli situated on the same plane as the Observer (Far plane; Fig. 1c,d).
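
The "SD values" mentioned in the last highlight presumably refer to the spread of a fitted psychometric function, the standard index of discrimination sensitivity in this kind of psychophysical task: a smaller SD means a steeper function and finer gaze-direction discrimination. The sketch below only illustrates that idea and is not the study's analysis code; the data values and variable names are invented.

```python
# Illustrative only: estimating gaze-direction sensitivity as the SD of a
# cumulative-Gaussian psychometric function fitted to left/right judgements.
# A smaller fitted SD corresponds to a steeper function, i.e. finer discrimination.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(gaze_angle_deg, mu, sd):
    # Probability of a "looking to my right" response as a function of the
    # Actor's gaze angle; mu is the point of subjective equality (PSE).
    return norm.cdf(gaze_angle_deg, loc=mu, scale=sd)

# Hypothetical data: tested gaze angles (deg) and proportion of "right" responses.
angles = np.array([-8.0, -4.0, -2.0, 0.0, 2.0, 4.0, 8.0])
p_right = np.array([0.05, 0.20, 0.35, 0.50, 0.70, 0.85, 0.95])

(mu_hat, sd_hat), _ = curve_fit(psychometric, angles, p_right, p0=[0.0, 3.0])
print(f"PSE = {mu_hat:.2f} deg, SD = {sd_hat:.2f} deg")
```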

Introduction

Gaze behaviours carry important nonverbal information that informs and regulates interactions between individuals [1,2,3]. We observed that discrimination performance improved only when participants believed that stimuli were displayed on the Far plane, irrespective of whether this assumption was correct, demonstrating that a subjective expectation regarding the positioning of virtual content attended by the Actor (i.e. whether the Actor fixated on stimuli positioned halfway between the pair or on the same plane occupied by the Observer) modulated gaze direction sensitivity. These findings have theoretical implications for our understanding of the impact of technology on social behaviour, showing how sources of sensory uncertainty that accompany the use of AR-HMDs can affect gaze interactions. They can also inform the design of AR interfaces that reduce these sources of visual uncertainty.
