Abstract

Industrial human-robot collaboration (HRC) aims to combine human intelligence and robotic capability to achieve higher productivity. In industrial HRC, communication between humans and robots is essential: each side must understand the other's intent for the collaboration to be fluent. A brain-computer interface (BCI) records the user's brain activity and translates it into interaction messages (e.g., control commands) to the outside world, building a direct and efficient communication channel between human and robot. However, lacking an information feedback mechanism, a BCI struggles to control a robot with a high degree of freedom using only a limited number of classifiable mental states. To address this problem, this paper proposes a closed-loop BCI with contextual visual feedback through an augmented reality (AR) headset. In this BCI, the electroencephalogram (EEG) patterns produced by multiple voluntary eye blinks serve as the input, and an online detection algorithm for them is proposed whose average accuracy reaches 94.31%. Moreover, an AR-enabled information feedback interface is designed to achieve interactive robotic path planning. A case study of an industrial HRC assembly task shows that the proposed closed-loop BCI can shorten the user-input time in human-robot interaction.
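
As a rough illustration of the closed-loop idea described above, the sketch below maps the number of consecutive voluntary blinks to a discrete command that the AR headset can preview before execution. The mapping and the names `COMMANDS` and `handle_blink_event` are hypothetical, introduced here for illustration only; they are not the paper's actual interface.

```python
from typing import Optional

# Hypothetical mapping from detected blink counts to discrete commands.
COMMANDS = {
    2: "confirm_waypoint",   # fix the current AR cursor pose as a path point
    3: "undo_waypoint",      # remove the last confirmed path point
    4: "execute_path",       # run the trajectory previewed in the AR headset
}

def handle_blink_event(blink_count: int) -> Optional[str]:
    """Translate a detected multi-blink pattern into a robot command, if any."""
    return COMMANDS.get(blink_count)
```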

Highlights

  • In recent years, industrial human-robot collaboration (HRC), a promising production paradigm that enables humans and robots to work side by side toward shared production goals, has attracted growing interest from industry and academia

  • It remains challenging for a brain-computer interface (BCI) to control a robot with a high degree of freedom, because the limited number of mental states that existing BCIs can classify cannot generate high-dimensional commands

  • We introduce augmented reality feedback to achieve a closed-loop BCI for robot control in an industrial HRC scenario


Summary

Introduction

Industrial human-robot collaboration (HRC), a promising production paradigm that enables humans and robots to work side by side toward shared production goals, has attracted growing interest from industry and academia. [21] proposed a Power Spectrum Density (PSD) threshold-based method, in which the PSD over a moving window is compared to a threshold to detect eye-blink patterns. Such threshold-based approaches are highly sensitive to the feature selection and the chosen threshold, which can differ widely across users and hardware; a minimal sketch of this idea appears below. [24] proposed an RBF network-based eye-blink feature extraction method that segments the signal into fixed windows of 1 second. Such learning-based methods require user training and machine learning, which in turn needs a large amount of labelled training data. In related work, a monitor displays the objects the robot can grasp, and the user performs different motor imagery to select the object to be grasped. These studies focus mainly on high-level control and cannot specify the detailed configuration of the robot operation, such as path planning. In this paper, inspired by the AR-based robot path planning in [30,31,32], we propose an AR-enabled robot path planning method driven by BCI input, without any extra hand-held interaction tool or hand-gesture input.
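
For concreteness, the following is a minimal sketch of the moving-window PSD thresholding idea attributed to [21], assuming a single frontal EEG channel sampled at 250 Hz. The window length, frequency band, and threshold below are illustrative placeholders, not values from that work.

```python
import numpy as np
from scipy.signal import welch

FS = 250            # sampling rate in Hz (assumed)
WIN = FS            # 1 s moving window
STEP = FS // 10     # slide the window in 100 ms steps
THRESH = 50.0       # PSD power threshold (device- and user-dependent)

def detect_blinks_psd(eeg: np.ndarray) -> list[int]:
    """Return start indices of windows whose low-frequency PSD power exceeds THRESH."""
    hits = []
    for start in range(0, len(eeg) - WIN, STEP):
        seg = eeg[start:start + WIN]
        freqs, psd = welch(seg, fs=FS, nperseg=WIN)
        # Eye-blink artifacts concentrate power below ~5 Hz on frontal channels.
        band = (freqs >= 0.5) & (freqs <= 5.0)
        if psd[band].sum() > THRESH:
            hits.append(start)
    return hits
```

As the surrounding text notes, the weakness of this family of methods is that THRESH and the band edges must be re-tuned per user and per EEG device.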

System architecture
Multiple voluntary eye blink detection algorithm
Detect blinks of all kinds
Cluster and select the group with the larger amplitude
Online detection approach
Control logic
Trajectory specification based on head motion capture
AR-based motion preview as feedback
Command handling
Performance of the voluntary eye blink detection algorithm
Case study of an industrial HRC assembly task
Findings
Conclusion and future work