Abstract

Ambivalence, the simultaneous experience of both positive and negative feelings about one and the same attitude object, has been investigated within psychological attitude research for decades. Ambivalence is interpreted as an attitudinal conflict with distinct affective, behavioral, and cognitive consequences. Social psychological research has shown that ambivalence is sometimes confused with neutrality because of measures that cannot distinguish between the two. Likewise, in social robotics research the attitudes of users are often characterized as neutral. We assume that this is because existing research on attitudes towards robots has lacked measures capable of capturing ambivalence. In the current experiment (N = 45), we show that a neutral and a robot stimulus were evaluated equivalently when using a bipolar item, but evaluations differed greatly regarding self-reported ambivalence and arousal. This points to attitudes towards robots being in fact highly ambivalent, although they might appear neutral depending on the measurement method. To gain valid insights into people’s attitudes towards robots, positive and negative evaluations of robots should be measured separately, providing participants with measures to express evaluative conflict instead of administering bipolar items. Acknowledging the role of ambivalence in attitude research focusing on robots has the potential to deepen our understanding of users’ attitudes and their potential evaluative conflicts, and thus improve predictions of behavior from attitudes towards robots.

Highlights

  • Would you like to have a social robot at home? You may have mixed feelings about an artificial intelligence system in your private space

  • We extend the generalizability of the ambivalence construct to the domain of robotics

  • By comparing bipolar measures with multi-dimensional measures in the current study, we demonstrate that the distinction between ambivalence and neutrality is relevant in robotics research


Introduction

Would you like to have a social robot at home? You may have mixed feelings about an artificial intelligence system in your private space. On the one hand, owning a robot may sound promising. Social robots can be entertaining, and they might assist with chores or serve as companions. On the other hand, such technology gives rise to privacy and security concerns, which are commonly associated with mobile, cloud-connected devices making use of a camera or a microphone [1]. You might feel both positively and negatively about having a robot at home, and this evaluative conflict makes the question of whether to buy a robot uncomfortable and difficult to answer.


