Abstract
Human–robot interaction (HRI) is an important field of research for a future society in which robots and humans live together. While most HRI studies focus on appearance and dialogue, touch communication has received little attention despite the important role it plays in human–human communication. This paper investigates how and where humans touch an inorganic, non-zoomorphic robot arm. Based on these results, we installed touch sensors on the robot arm and conducted experiments to collect data on users' impressions of the robot when touching it. Our results suggest two main findings. First, touch gestures collected with the two sensors can be classified using machine learning. Second, touch communication between humans and robots can improve the user's impression of the robot.
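The classification step mentioned in the abstract could look like the following minimal sketch. The sensor signals, gesture labels ("pat" and "stroke"), summary features, and the SVM classifier here are all illustrative assumptions, not the paper's actual pipeline:

```python
# Hypothetical sketch: classifying touch gestures ("pat" vs. "stroke")
# from two pressure-sensor time series with a simple SVM.
# Signal shapes, features, and classifier are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_gesture(kind, n=50):
    t = np.linspace(0, 1, n)
    if kind == "pat":      # short, sharp pressure peak
        sig = np.exp(-((t - 0.5) ** 2) / 0.005)
    else:                  # "stroke": slow, broad pressure profile
        sig = np.sin(np.pi * t)
    # two sensors: slightly attenuated copy on the second, plus noise
    s1 = sig + rng.normal(0, 0.05, n)
    s2 = 0.8 * sig + rng.normal(0, 0.05, n)
    return np.stack([s1, s2])

def features(x):
    # per-sensor summary features: mean, peak, and mean absolute slope
    return np.concatenate(
        [[c.mean(), c.max(), np.abs(np.diff(c)).mean()] for c in x]
    )

X, y = [], []
for label, kind in enumerate(["pat", "stroke"]):
    for _ in range(40):
        X.append(features(make_gesture(kind)))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = SVC().fit(X[::2], y[::2])    # train on every other sample
acc = clf.score(X[1::2], y[1::2])  # evaluate on the held-out half
```

In practice the feature set and model would be chosen from the collected gesture data; this sketch only shows the overall shape of such a classifier.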
Highlights
Before installing the sensors on the robot arm, we conducted an experiment to observe how and where people attempted to make contact with the robot arm used in this study
We conducted an emotion transfer experiment and a touch gesture acquisition experiment on a robot arm equipped with touch sensors
Summary
This work aims to improve HRI by introducing touch-based communication