Abstract
Chatbots are an emerging technology, and there is still much to learn about how conversational user interfaces will affect the way humans communicate, not only with computers but also with one another. Further studies on anthropomorphic agents, and on the projection of human characteristics onto a system, are required to develop this area. Gender stereotypes have a profound effect on human behaviour, and applying a gender to a conversational agent brings with it the projection of user biases and preconceptions. These feelings and perceptions about an agent shape the mental models users build of a system, and users may be inclined to judge the success of a system by their biases and emotional connections with the agent rather than by the system's performance. Many studies have shown how gender affects human perceptions of a conversational agent; however, there is limited research on the effect of gender when applied to a chatbot system. This chapter presents early results from a research study indicating that chatbot gender does have an effect on users' overall satisfaction and on gender-stereotypical perception. Subsequent studies could examine the ethical implications of these results and extend the research by increasing the sample size to validate statistical significance, and by recruiting a more diverse sample of participants from various backgrounds and experiences.

RESEARCH HIGHLIGHTS

- Many studies have shown how gender affects human perceptions of a conversational agent; however, there is limited research on the effect of gender when applied to a chatbot system.
- This study presents early results indicating that chatbot gender does have an effect on users' overall satisfaction and on gender-stereotypical perception.
- Users are more likely to apply gender stereotypes when a chatbot operates within a gender-stereotypical subject domain, such as mechanics, and when the chatbot's gender does not conform to gender stereotypes.
- This study raises ethical issues: should we exploit this result and perpetuate the bias and stereotyping? Should technical-advice chatbots really be given a male persona? The dilemma is that a male version would elicit more trust, yet adopting one perpetuates the stereotype.