Abstract

Robot design to simulate interpersonal social interaction is an active area of research with applications in therapy and companionship. Neural responses to eye-to-eye contact in humans have recently been employed to determine the neural systems that are active during social interactions. Whether eye-contact with a social robot engages the same neural systems remains to be seen. Here, we employ a similar approach to compare human-human and human-robot social interactions. We assume that if human-human and human-robot eye-contact elicit similar neural activity in the human, then the perceptual and cognitive processing is also the same for human and robot; that is, the robot is processed similarly to the human. However, if the neural effects differ, then perceptual and cognitive processing is assumed to differ as well. In this study, neural activity was compared for human-to-human and human-to-robot conditions using near-infrared spectroscopy for neural imaging and a robot (Maki) with eyes that blink and move right and left. Eye-contact was confirmed by eye-tracking for both conditions. Increased neural activity was observed in human social systems, including the right temporoparietal junction (rTPJ) and the dorsolateral prefrontal cortex, during human-human eye-contact but not human-robot eye-contact. This suggests that the type of human-robot eye-contact used here is not sufficient to engage the rTPJ in the human. This study establishes a foundation for future research into human-robot eye-contact to determine how elements of robot design and behavior impact human social processing within this type of interaction, and it may offer a method for capturing difficult-to-quantify components of human-robot interaction, such as social engagement.
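
The comparison logic described above (greater activity in social-brain regions such as the rTPJ during human-human than during human-robot eye-contact is taken as evidence of different social processing) can be illustrated with a minimal sketch. The Python code below is not the authors' analysis pipeline; it assumes hypothetical per-participant rTPJ response estimates (beta_human, beta_robot, simulated placeholders) and simply tests the within-participant human-minus-robot contrast with a paired t-test.

    # Minimal sketch of the condition contrast described in the abstract.
    # All values are simulated placeholders, not study data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_participants = 20  # hypothetical sample size

    # Hypothetical per-participant rTPJ response estimates (arbitrary units)
    # for the human-partner and robot-partner eye-contact conditions.
    beta_human = rng.normal(loc=0.15, scale=0.10, size=n_participants)
    beta_robot = rng.normal(loc=0.02, scale=0.10, size=n_participants)

    # Within-participant contrast: human-partner minus robot-partner response.
    contrast = beta_human - beta_robot

    # Paired t-test across participants: is rTPJ activity reliably greater
    # when the eye-contact partner is a human rather than a robot?
    t_stat, p_value = stats.ttest_rel(beta_human, beta_robot)
    print(f"mean contrast = {contrast.mean():.3f}, "
          f"t({n_participants - 1}) = {t_stat:.2f}, p = {p_value:.4f}")

As noted in the abstract, such a contrast is only interpretable if eye-contact itself is confirmed in both conditions, which the study did via eye-tracking.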

Highlights

  • Two participants appeared to make less eye-contact with their robot partner than they did with their human partner, though both still showed task compliance with the robot

  • Eye-contact with another human contrasted with baseline (Supplementary Figure S1, left) resulted in significant activity in the rTPJ


Introduction

In the era of social distancing, the importance of direct social interaction to personal health and well-being has never been more clear (Brooke and Jackson, 2020; Okruszek et al., 2020). Eye-to-eye contact is just one example of a behavior that carries robust social implications, which impact the outcome of an interaction via changes to the emotional internal state of those engaged in it (Kleinke, 1986). Robots occupy a unique space on the spectrum from object to agent, which can be manipulated through robot behavior, perception, and reaction, as well as through the development of expectations and beliefs regarding the robot within the human interaction partner. Due to these qualities, robots are a valuable tool for parsing elements of human neural processing, and comparison of the processing of humans and of robots as social partners is a fruitful ground for discovery (Rauchbauer et al., 2019; Sciutti et al., 2012a).

