Abstract

Extensive research has been conducted on human head pose detection systems, and several applications have been identified for deploying such systems. Deep learning-based head pose detection is one such method; it has been studied for several decades and reports high success rates during implementation. However, among the many pet robots designed and developed for various needs, wearable pet robots, and head pose detection models for wearable pet robots, are entirely absent. Designing a wearable pet robot capable of head pose detection can open further opportunities for research and development of such systems. In this paper, we present a novel head pose detection system for a wearable parrot-inspired pet robot using images taken from the wearer’s shoulder. This is the first time head pose detection has been studied in wearable robots and using images from a side angle. In this study, we used the AlexNet convolutional neural network architecture, trained on images from the database, for the head pose detection system. The system was tested with 250 images and achieved an accuracy of 94.4% across five head poses, namely left, left intermediate, straight, right, and right intermediate.
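As a minimal sketch of the kind of pipeline the abstract describes (not the authors' implementation), the following PyTorch/torchvision code fine-tunes a pretrained AlexNet for the five head pose classes. The dataset directory layout, hyperparameters, and training schedule are assumptions made for illustration only.

```python
# Sketch: fine-tuning AlexNet for five head pose classes (left, left
# intermediate, straight, right intermediate, right). Paths, batch size,
# learning rate, and epoch count are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

CLASSES = ["left", "left_intermediate", "straight", "right_intermediate", "right"]

# Standard AlexNet preprocessing: resize, crop, ImageNet normalization.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: shoulder-view images grouped by pose label.
train_set = datasets.ImageFolder("head_pose_db/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from ImageNet weights and replace the final layer for 5 poses.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, len(CLASSES))

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```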

Highlights

  • The field of robotics has found numerous applications in recent years

  • This paper presents a novel head pose detection system for a bio-inspired wearable pet robot, KiliRo, to classify five head poses from the side angle of the wearer

  • The presented deep learning-based head pose detection system reports an overall accuracy of 94.4%

Introduction

The field of robotics has found numerous applications in recent years. Earlier, robots were used predominantly to perform tedious and repetitive tasks, such as manufacturing and transport. With a human-centric HRI system, a robot can perform different actions for the same gesture, according to the user recognized through a face recognition system. Several other robots, such as ROBITA, Robonaut, and Leonardo, use gestures to interact with humans [8]. A pet robot that we have developed has been used to reduce stress levels of patients [34], improve the learning abilities of children [35], and entertain participants [36]. With such multi-industry applicability, pet robots with head pose detection can further improve closeness with humans and enable more effective human–robot interaction models. In this paper, we present the design and development of a wearable parrot-inspired pet robot, KiliRo, and a human–robot interaction model for wearable pet robots based on a vision-based head pose detection method.
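As a minimal illustration of how a detected head pose could feed such an interaction model, the sketch below maps each of the five pose classes to a robot behaviour. The action names and the react() helper are illustrative assumptions, not part of KiliRo's actual controller.

```python
# Illustrative sketch only: mapping a detected head pose to a hypothetical
# reaction in a wearable pet robot's HRI loop. Pose labels follow the paper;
# the actions and this interface are assumptions for illustration.
POSE_TO_ACTION = {
    "left": "turn_head_toward_wearer",
    "left_intermediate": "tilt_head",
    "straight": "idle_sway",
    "right_intermediate": "tilt_head",
    "right": "turn_head_toward_wearer",
}

def react(pose: str) -> str:
    """Return the action the robot should perform for a detected pose."""
    return POSE_TO_ACTION.get(pose, "idle_sway")
```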
