Abstract

Conversational agents have spread across multiple industries, with a growing ability to engage users in intelligent conversation. Conversations with chatbots differ from interpersonal communication in terms of turn-taking, intentions, and behavior. We study de-identified chat logs from 30 conversations with a well-recognized chatbot to understand (i) how people construct turns in conversation to perform 'social action', extending human experiences and knowledge, (ii) how people express typical human constructs such as emotion in their interactions with chatbots, and (iii) what discursive strategies people use to create 'shared meaning' and an identity for themselves. Our findings reveal that users' conversational expectations and behavior are similar to those in human-to-human sharing (how people talk), but that the nature of the information shared is more diverse (what they talk about). These findings can advance discussion both on how to design conversational agents to be more intelligible and on how to make them sensitive to unnecessary information disclosure.