Abstract

A novel breed of robots, known as socially assistive robots, is emerging. These robots are capable of providing assistance to individuals through social and cognitive interaction. However, a number of research issues must be addressed before such robots can be effectively designed. In this paper, we address one main challenge in the development of intelligent socially assistive robots: the robot’s ability to identify human non-verbal communication during assistive interactions. In particular, we present a unique non-contact, non-restricting, automated sensor-based approach for identifying and categorizing human upper body language in order to determine how accessible a person is to the robot during natural real-time human-robot interaction (HRI). This classification allows a robot to effectively determine its own reactive, task-driven behavior during assistive interactions. Human body language is an important aspect of communicative non-verbal behavior; body pose and position play a vital role in conveying human intent, moods, attitudes and affect. Preliminary experiments show the potential of integrating the proposed body language recognition and classification technique into socially assistive robotic systems partaking in HRI scenarios.
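The abstract does not describe the sensing pipeline or the accessibility categories themselves, but the idea of mapping observed upper body language to a coarse accessibility level can be illustrated with a minimal sketch. The feature names, thresholds, and category labels below are hypothetical stand-ins, not the authors' actual method.

```python
# Illustrative sketch only: features, thresholds, and accessibility levels are
# hypothetical stand-ins for a non-contact upper-body-language classifier of
# the kind described in the abstract.

from dataclasses import dataclass


@dataclass
class UpperBodyPose:
    """Hypothetical pose features from a non-contact sensor (e.g., a depth camera)."""
    torso_lean_deg: float      # positive = leaning toward the robot, negative = away
    shoulder_yaw_deg: float    # 0 = shoulders squarely facing the robot
    arms_crossed: bool         # crossed arms often read as a closed posture


def accessibility_level(pose: UpperBodyPose) -> str:
    """Map an upper-body pose to a coarse accessibility category.

    Returns "high", "medium", or "low", indicating how open the person
    appears to be to interaction with the robot.
    """
    score = 0
    if pose.torso_lean_deg > 5:          # leaning toward the robot
        score += 1
    if abs(pose.shoulder_yaw_deg) < 30:  # oriented toward the robot
        score += 1
    if not pose.arms_crossed:            # open arm posture
        score += 1

    return {3: "high", 2: "medium"}.get(score, "low")


if __name__ == "__main__":
    pose = UpperBodyPose(torso_lean_deg=8.0, shoulder_yaw_deg=12.0, arms_crossed=False)
    print(accessibility_level(pose))  # -> "high"
```

In a full system, such a classification would feed the robot's decision layer, letting it select a reactive, task-driven behavior appropriate to how accessible the person currently appears.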
