Abstract

With the rise of robotics and artificial intelligence, good communication between humans and machines is becoming increasingly important. However, users with language and hearing impairments may find synthetic speech difficult to understand. In this study, we explore which types of sentence structure and levels of word complexity affect the intelligibility of speech in unfamiliar contexts. Using semantically unpredictable sentences, we found that sentences with more complex syntax, such as those containing relative pronouns and question words, are harder to comprehend, while at the word level it is the shorter and simpler words that contribute most to misunderstandings. Although word frequency affects how well a word is recognised, its effect is much smaller than that of how phonetically distinctive the word is. There was also evidence of a significant difference between native and non-native speakers in how well they understood the sentences. These results may help in designing better dialogue systems for machine-to-human interaction, especially in healthcare, where users often have language and hearing impairments.
