Abstract

Many people have neuromuscular conditions that affect their daily lives, leading them to lose a significant degree of autonomy. When their disabilities do not involve speech disorders, robotic wheelchairs equipped with voice assistant technologies may provide appropriate human–robot interaction for them. Given the wide improvement and diffusion of Google Assistant, Apple’s Siri, Microsoft’s Cortana, Amazon’s Alexa, etc., such voice assistant technologies can be fully integrated into robotic wheelchairs to improve the quality of life of affected people. In this paper, we therefore propose an abstraction layer capable of providing appropriate human–robot interaction. It allows the use of voice assistant tools that may trigger different kinds of applications for the interaction between the robot and the user. Furthermore, we propose a use case as a possible instance of the considered abstraction layer. Within the use case, we chose existing tools for each component of the proposed abstraction layer. For example, Google Assistant was employed as the voice assistant tool; its functions and APIs were leveraged for some of the applications we deployed. On top of the use case thus defined, we created several applications that we detail and discuss. The benefit of the resulting human–robot interaction is therefore two-fold: on the one hand, the user may interact with any of the developed applications; on the other hand, the user can also rely on voice assistant tools to receive open-domain answers when the user's statement does not trigger any of the robot's applications. An evaluation of the presented instance was carried out using the Software Architecture Analysis Method, whereas the user experience was evaluated through ad-hoc questionnaires. Our proposed abstraction layer is general and can be instantiated on any robotic platform, including robotic wheelchairs.

Highlights

  • People with neuromuscular problems tend to lose a significant degree of autonomy in their daily life

  • We propose an abstraction layer for Human–Robot Interaction (HRI) integrated with voice assistant technologies

  • Out of the proposed abstraction layer, we instantiated a use case

  • For the proposed use case, we show how to integrate voice assistant technologies with applications written for the used robotic platform; as such, we developed a Chess game by using Dialogflow [20]

  • We introduced an abstraction layer for HRI that includes voice assistant technologies and that can be plugged into any robotic platform


Introduction

People with neuromuscular problems tend to lose a significant degree of autonomy in their daily life. As each voice assistant makes its own APIs available to developers, their technology can be integrated for several purposes and into different devices, robots included, bringing about new business opportunities in diverse areas [18]; in particular, they can be employed within robotic applications. In line with the aforementioned issues, in this paper we introduce an abstraction layer for Human–Robot Interaction (HRI) that employs voice assistant technologies. Such a design allows robots to enrich their interaction with users by exploiting the power of recent voice assistant technologies.

Robotic Wheelchairs
Voice Assistant Tools
The Proposed Abstraction Layer
The Proposed Use Case
The Developed Applications
Bingo Game
Semantic Sentiment Analysis Application
Generative Conversational Agent Application
Robot Action Commands Application
Object Detection Application
Mr Chess Application
Use Case Architecture Evaluation
Identify Stakeholders
Identifying and Classifying Scenarios
Scenarios Evaluation and Interactions
Overall Evaluation
User Experience
Findings
Conclusions and Future Works