Abstract

With the commercialization of ChatGPT, generative artificial intelligence (AI) has been applied to almost every area of our lives. However, even though generative AI has become an everyday technology that anyone can use, most non-expert users still need to understand the process behind, and the reasons for, its results, because generative AI can be misused through insufficient knowledge and misunderstanding. Therefore, this study investigated users' preferences for when, what, and how generative AI should explain its generation process and the reasoning behind its results, using the conjoint method and mixed logit analysis. The results show that users are most sensitive to the timing of eXplainable AI (XAI) explanations, and that they want additional information only when they ask for it while using generative AI. These findings can help shape the XAI design of future generative AI from a user perspective and improve usability.
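To make the methodology concrete, the sketch below simulates a conjoint-style choice experiment and estimates a mixed logit model by simulated maximum likelihood. The attribute names (timing, content, format), their coding, and all parameter values are illustrative assumptions, not the paper's actual design; only the general technique (mixed logit with a random coefficient, as named in the abstract) is taken from the source.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical conjoint design: each alternative is coded by three binary
# attributes (names and levels are illustrative, not the paper's):
#   timing  = 1 if the explanation appears only when the user asks for it
#   content = 1 if the explanation covers the generation process (vs. outcome)
#   format  = 1 if the explanation is textual (vs. visual)
N, J, R = 500, 3, 100  # respondents, alternatives per choice task, MC draws
X = rng.integers(0, 2, size=(N, J, 3)).astype(float)

# "True" preferences: mean part-worths plus a normally distributed random
# coefficient on timing -- the defining ingredient of a mixed logit.
beta_mean = np.array([1.2, 0.5, 0.3])
beta_i = np.tile(beta_mean, (N, 1))
beta_i[:, 0] += rng.normal(0.0, 0.8, N)  # taste heterogeneity on timing
util = np.einsum('njk,nk->nj', X, beta_i) + rng.gumbel(size=(N, J))
y = util.argmax(axis=1)  # observed choices

# Simulated log-likelihood: theta = (b_timing, b_content, b_format, log_sd),
# with the timing coefficient random across respondents.
draws = rng.standard_normal((R, N))  # draws fixed across optimizer steps

def neg_sll(theta):
    b, sd = theta[:3], np.exp(theta[3])
    prob_sum = np.zeros(N)
    for r in range(R):
        br = np.tile(b, (N, 1))
        br[:, 0] += sd * draws[r]
        v = np.einsum('njk,nk->nj', X, br)
        v -= v.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(v)
        p /= p.sum(axis=1, keepdims=True)
        prob_sum += p[np.arange(N), y]
    return -np.log(prob_sum / R).sum()  # average over draws, then log

res = minimize(neg_sll, x0=np.zeros(4), method='BFGS')
b_hat, sd_hat = res.x[:3], np.exp(res.x[3])
print("mean part-worths:", b_hat.round(2), "| sd(timing):", round(sd_hat, 2))
```

In this setup, a large estimated mean part-worth for the timing attribute, relative to content and format, would correspond to the abstract's finding that users are most sensitive to when explanations are provided.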
