Abstract

As one of the most popular AI applications, chatbots are creating new ways for businesses to interact with their customers and deliver value, and their adoption and continued use depend on users’ trust. However, owing to the opacity of AI-related technology and the ambiguity of its application boundaries, it is difficult to determine which aspects enhance the adoption of chatbots and how these aspects interactively affect human trust. Based on task-technology fit theory, we developed a research model to investigate how two conversational cues of chatbots, human-like cues and tailored responses, influence human trust toward chatbots, and to explore the relevant boundary conditions (individual characteristics and task characteristics) of interacting with chatbots. One survey and two experiments were performed to test the research model. The results indicated that (1) perceived task-solving competence and social presence mediate the pathway from conversational cues to human trust, which was validated in the contexts of e-commerce and education; (2) users’ degree of ambiguity tolerance moderates the effects of the two conversational cues on social presence; and (3) when performing highly creative tasks, a human-like chatbot induces higher perceived task-solving competence. Our findings not only contribute to the literature on AI trust but also provide practical implications for the development of chatbots and their assignment to individuals and tasks.