Abstract

Recent developments in non-muscular human–robot interfaces (HRI) and shared control strategies have shown potential for controlling an assistive robotic arm by people with no residual movement or muscular activity in the upper limbs. However, most non-muscular HRIs produce only discrete-valued commands, resulting in non-intuitive and less effective control of a dexterous assistive robotic arm. Furthermore, in the shared control strategies of such applications, control usually switches between the user's commands and the robot autonomy's commands; previous user studies have found that this switching reduces the user's sense of agency and causes frustration. In this study, we first propose an intuitive and easy-to-learn-and-use hybrid HRI that combines a brain–machine interface (BMI) with a gaze-tracking interface. In the proposed hybrid gaze-BMI, the movement speed is modulated continuously via the motor intention, seamlessly and simultaneously with unconstrained control of the movement direction via the gaze signals. We then propose a shared control paradigm that always blends the user input with the autonomy, with the blending ratio regulated dynamically. The proposed hybrid gaze-BMI and shared control paradigm were validated in a robotic arm reaching task performed by healthy subjects. All users were able to employ the hybrid gaze-BMI to move the end-effector across the horizontal plane, reaching targets sequentially while avoiding collisions with obstacles. The shared control paradigm maintained as much volitional control as possible while providing assistance for the most difficult parts of the task. The presented semi-autonomous robotic system yielded continuous, smooth, and collision-free motion trajectories for the end-effector approaching the target. Compared to a system without assistance from the robot autonomy, it significantly reduced the failure rate as well as the time and effort spent by the user to complete the tasks.
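
The abstract does not spell out the blending law itself. Purely as an illustrative sketch, the snippet below shows one plausible way a "gaze sets direction, BMI sets speed" user command could be fused with an autonomy command through a continuously updated weight; the function names, the linear blending form, and all numeric values are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def hybrid_gaze_bmi_command(gaze_direction, bmi_speed, max_speed=0.1):
    """User velocity command on the horizontal plane:
    gaze sets the (unconstrained) direction, while the decoded motor
    intention from the BMI scales the speed continuously in [0, 1]."""
    direction = np.asarray(gaze_direction, dtype=float)
    direction /= (np.linalg.norm(direction) + 1e-9)   # unit direction from gaze
    return bmi_speed * max_speed * direction           # velocity, scaled by intention

def blended_command(u_user, u_autonomy, alpha):
    """Always-on linear blending of user and autonomy commands.
    alpha in [0, 1] is the autonomy weight, updated continuously
    (e.g. rising near obstacles, falling in open space)."""
    return (1.0 - alpha) * np.asarray(u_user) + alpha * np.asarray(u_autonomy)

# One tick of the control loop (all values hypothetical):
u_user = hybrid_gaze_bmi_command(gaze_direction=[1.0, 0.2], bmi_speed=0.7)
u_auto = np.array([0.02, 0.05])   # e.g. an obstacle-avoidance velocity
u_cmd = blended_command(u_user, u_auto, alpha=0.6)
print(u_cmd)
```

The property this sketch mirrors is that both signals always contribute to the commanded velocity; control never switches wholesale between the user and the autonomy.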

Highlights

  • Assistive robotic systems have demonstrated high potential in enabling people with upper limb physical disabilities, such as patients with traumatic spinal cord injury (SCI), amyotrophic lateral sclerosis (ALS), or tetraplegia, to achieve greater independence and thereby increase quality of life (Vogel et al., 2015; Beckerle et al., 2017; Muelling et al., 2017).

  • The experiments are performed by a number of able-bodied volunteers, and the results show that the new human–robot interface (HRI)-driven semi-autonomous assistive robotic system allows for a continuous, smooth, and collision-free motion trajectory for the end-effector approaching the target, significantly reducing the failure rate as well as the time and effort spent by the user to complete the tasks.

  • We propose a new control paradigm for the robotic arm reaching task, where the robot autonomy is dynamically blended with the gaze-brain–machine interface (BMI) control from the user.


Introduction

Assistive robotic systems have demonstrated high potential in enabling people with upper limb physical disabilities, such as patients with traumatic spinal cord injury (SCI), amyotrophic lateral sclerosis (ALS), or tetraplegia, to achieve greater independence and thereby increase quality of life (Vogel et al., 2015; Beckerle et al., 2017; Muelling et al., 2017). For severely impaired patients, the conventional manual control interfaces used by able-bodied or mildly impaired people [computer mouse, keyboard, joystick, electromyography (EMG)-based interfaces, etc.] may no longer be applicable for controlling an assistive robot, because these interfaces require some residual movement or muscle activity from the user. For people with no residual movement or muscular activity, previous studies have focused on two key aspects of facilitating the interaction between patients and the assistive robot: the design of human–robot interfaces (HRI), and the devising of human–robot coordination strategies tailored to the interface.
