Abstract

In recent years, Intelligent Virtual Assistants (IVAs), such as Alexa and Siri, have grown increasingly popular. Yet privacy advocates raise serious concerns about the amount and type of data these systems collect and process. Among many other factors, technology trust appears to be particularly significant here, especially for the adoption of IVAs, as these systems usually provide little transparency about how they function and how they use personal and potentially sensitive data. While technology trust is influenced by many different socio-technical parameters, this article focuses on human personality and its connection to trust perceptions, which in turn may affect the actual adoption of IVA products. To this end, we report on the results of an online survey (n=367). Findings show that, on a scale from 0 to 100%, people trust IVAs at 51.59% on average. Furthermore, the data point to a significant positive correlation between people’s propensity to trust technology in general and their trust in IVAs. They also show that those who exhibit a higher propensity to trust technology tend to have a higher affinity for technology interaction and are consequently more likely to adopt IVAs.
