Abstract
With the development of deep connections between humans and Artificial Intelligence voice‐based assistants (VAs), the relationship between humans and machines has been transformed. For relationships to work, trust must be established. Although the capabilities of VAs offer retailers and consumers enhanced opportunities, building trust with machines is inherently challenging. In this paper, we propose integrating Human–Computer Interaction theories and Para‐Social Relationship Theory to develop insight into how trust in, and attitudes toward, VAs are established. Adopting a mixed‐method approach, we first quantitatively test the proposed model using Covariance‐Based Structural Equation Modeling on 466 respondents; building on these findings, a second, qualitative study reveals four main themes. Findings show that while functional elements drive users' attitude toward using VAs, the social attributes, namely social presence and social cognition, are the unique antecedents of trust. Additionally, the research illustrates a peculiar dynamic between privacy and trust and shows how users distinguish two different sources of trustworthiness in their interactions with VAs, identifying the brand producer as the data collector. Taken together, these results reinforce the idea that individuals interact with VAs as social entities, employing human social rules, thus supporting the adoption of a para‐social perspective.
Highlights
Why does she [Alexa] always listen to me? Voice technology usage is rising worldwide, with almost 4.2 billion voice‐activated assistants (VAs) in use in devices around the world in the last year (Statista, 2020).
Olson (2019) highlights that almost 41% of VA users are concerned about privacy and passive listening, and trust has been identified as the main barrier for voice assistants' users and shoppers (PwC, 2019).
The results show that Perceived Ease of Use (PEOU) and social cognition, in terms of perceived competence, both have strong positive effects on trust and attitude.
Summary
Voice technology usage is rising worldwide, with almost 4.2 billion voice‐activated assistants (VAs) in use in devices around the world in the last year (Statista, 2020). While VAs' adoption is advancing quickly, their usage remains limited to basic tasks. Olson (2019) highlights that almost 41% of VA users are concerned about privacy and passive listening, and trust has been identified as the main barrier for voice assistants' users and shoppers (PwC, 2019). Trust in the technology is among the obstacles that can cause worry and concern for current VA adopters and hinder full adoption in the near future (Marcus, 2019; Rossi, 2019). While the role of trust in VAs' adoption is relevant to both practitioners and …
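The quantitative study tests the hypothesized relationships with Covariance‐Based Structural Equation Modeling (CB‐SEM). As a rough illustration of how such a model can be specified, the minimal sketch below uses Python's semopy package; the construct names, indicator columns, and the exact set of structural paths are assumptions for illustration, not the paper's actual measurement instrument.

```python
# Minimal CB-SEM sketch (illustrative only): latent constructs, indicators,
# and structural paths are assumed, not the study's actual specification.
import pandas as pd
from semopy import Model, calc_stats

MODEL_DESC = """
# Measurement model: each latent construct measured by survey items
PEOU            =~ peou1 + peou2 + peou3
SocialPresence  =~ sp1 + sp2 + sp3
SocialCognition =~ sc1 + sc2 + sc3
Trust           =~ tr1 + tr2 + tr3
Attitude        =~ att1 + att2 + att3

# Structural model: hypothesized antecedents of trust and attitude
Trust    ~ PEOU + SocialPresence + SocialCognition
Attitude ~ PEOU + Trust
"""

def fit_cb_sem(survey_csv: str):
    """Fit the covariance-based SEM to survey responses (one row per respondent)."""
    data = pd.read_csv(survey_csv)    # e.g., 466 respondents as in the study
    model = Model(MODEL_DESC)
    model.fit(data)                   # maximum-likelihood estimation
    estimates = model.inspect()       # loadings, path coefficients, p-values
    fit_indices = calc_stats(model)   # chi-square, CFI, RMSEA, etc.
    return estimates, fit_indices
```

In practice, model fit would be checked against conventional cut-offs (e.g., CFI ≥ 0.95, RMSEA ≤ 0.06) before interpreting the path coefficients.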