Abstract

With the development of artificial intelligence technology, voice-based intelligent systems (VISs), such as AI speakers and virtual assistants, are becoming increasingly involved in everyday life. Interaction with VISs constitutes a new paradigm, human–AI interaction, which differs from conventional human–computer interaction. Using a Kansei engineering approach, we propose a method for evaluating user satisfaction during user-centered interaction between a user and a VIS. To apply the evaluation method, a VIS was developed whose behavior varied along four types of design parameters. Twenty-three subjects interacted with the VIS, and their satisfaction was measured using Kansei words (KWs). The questionnaire scores collected through the KWs were analyzed with exploratory factor analysis, and ANOVA was used to test for differences in emotional response. On the “pleasurability” and “reliability” axes, two of the four design parameters, “sentence structure of the answer” and “number of trials to get the right answer for a question,” were confirmed to affect users’ emotional satisfaction. Four satisfaction groups were derived according to the levels of these design parameters. This study can serve as a reference for integrated assessments of emotional satisfaction that combine metrics such as biosignals and facial expressions.
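The analysis described above (exploratory factor analysis on KW questionnaire scores, followed by ANOVA across design-parameter levels) can be illustrated with a minimal sketch. The synthetic ratings, the number of Kansei words, the two-factor solution, and the level assignment below are all illustrative assumptions, not the authors' data or code.

```python
# Minimal sketch of the EFA + ANOVA pipeline, on synthetic data (not the study's dataset).
import numpy as np
from sklearn.decomposition import FactorAnalysis
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

n_subjects, n_kansei_words = 23, 20  # 23 participants; 20 Kansei words is an assumed count
scores = rng.integers(1, 8, size=(n_subjects, n_kansei_words)).astype(float)  # 7-point ratings

# Exploratory factor analysis: reduce KW ratings to two latent axes
# (analogous to the paper's "pleasurability" and "reliability" factors).
efa = FactorAnalysis(n_components=2, random_state=0)
factor_scores = efa.fit_transform(scores)  # shape: (23, 2)

# One-way ANOVA on the first factor across levels of one design parameter,
# e.g. "number of trials to get the right answer" (3 levels assumed here).
levels = np.arange(n_subjects) % 3  # hypothetical balanced 3-level assignment
groups = [factor_scores[levels == k, 0] for k in range(3)]
f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```

A full replication would also require factor rotation and the actual KW questionnaire; this sketch only shows the shape of the analysis.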

Highlights

  • Artificial intelligence based on various technologies, such as machine learning, natural language processing, machine vision, and big data, has been applied to systems in various fields to be used as user agents or mutual cooperation models [1]

  • Interaction with AI-infused systems is defined as human–AI interaction (HAII) [3] and conceived as a modified version of human–computer interaction (HCI), which provides differentiated interaction [4]

  • The user satisfaction of HAII, which differs from that of existing HCI, was evaluated using a voice-based intelligent system (VIS)

Introduction

Artificial intelligence based on various technologies, such as machine learning, natural language processing, machine vision, and big data, has been applied to systems in various fields to be used as user agents or mutual cooperation models [1]. A representative intelligent system is the voice-based intelligent system (VIS) in the form of a chatbot, built on advances in speech-recognition technologies such as natural language processing and text-to-speech. Fierce competition is underway to preempt the market for agent technology using VISs in artificial intelligence speakers and smartphones [2]. In HCI, humans have engaged in system-driven interaction, whereas in HAII intelligent systems are expected to provide user-centered interaction that adapts to the specified context of use. Purington et al. [5] investigated the impact on satisfaction and personalization using user reviews of VISs integrated into real life, and Cárdenas et al. [6]
