Abstract

This paper focuses on how to provide mobility to people with motor impairments through the integration of robotics and wearable computing systems. The burden of learning to control powered mobility devices should not fall entirely on people with disabilities; instead, the system should learn the user's movements. This requires learning the degrees of freedom of user movement and mapping these degrees of freedom onto electric-powered wheelchair (EPW) controls. Such a mapping cannot be static, because in some cases users will improve with practice. Our goal in this paper is to present a hands-free interface (HFI) that can be customized to the varying needs of EPW users through an appropriate mapping between the users' degrees of freedom and EPW controls. EPW users with different impairment types must learn how to operate a wheelchair with their residual body motions, and EPW interfaces are often customized to fit their needs. The HFI captures the signals generated by the user's voluntary shoulder and elbow movements and translates them into an EPW control scheme. We examine the correlation of kinematics that occur during moderately paced repetitive elbow and shoulder movements over a range of motion. The output of upper-limb movements (shoulder and elbow) was tested on six participants and compared with the output of a precision position tracking (PPT) optical system for validation. We find strong correlations between the HFI signal counts and the PPT optical system measurements during different upper-limb movements (ranging from r = 0.86 to 0.94). We also tested the HFI performance in driving the EPW in a virtual reality environment with a spinal-cord-injured (SCI) patient. The results showed that the HFI was able to adapt and translate the residual mobility of the SCI patient into efficient control commands within a week of training. These results are encouraging for the development of more efficient HFIs, especially for wheelchair users.
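The core idea in the abstract, learning each user's usable degrees of freedom and re-fitting the mapping onto EPW controls as the user improves, can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the percentile-based calibration and the linear angle-to-velocity mapping are all assumptions made for illustration.

```python
import numpy as np

def calibrate_range(samples):
    """Estimate a user's usable range of motion (degrees) from a short
    recording of repetitive movement; the 5th/95th percentiles reject
    occasional overshoots. Hypothetical helper, not from the paper."""
    return float(np.percentile(samples, 5)), float(np.percentile(samples, 95))

def angles_to_command(elbow_deg, shoulder_deg, elbow_range, shoulder_range,
                      v_max=1.0, w_max=1.5):
    """Map elbow flexion-extension onto forward speed (m/s) and shoulder
    flexion-extension onto turning rate (rad/s), normalised to each user's
    calibrated range so limited residual motion still spans the full
    command space."""
    def norm(x, lo, hi):
        return float(np.clip(2.0 * (x - lo) / (hi - lo) - 1.0, -1.0, 1.0))
    v = v_max * norm(elbow_deg, *elbow_range)
    w = w_max * norm(shoulder_deg, *shoulder_range)
    return v, w

# Re-running calibrate_range after practice sessions re-fits the mapping
# as the user's range of motion improves, keeping the mapping non-static.
elbow_range = calibrate_range(np.array([20, 35, 60, 85, 95, 110]))
shoulder_range = calibrate_range(np.array([5, 15, 30, 45, 60, 70]))
v, w = angles_to_command(70.0, 40.0, elbow_range, shoulder_range)
```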

Highlights

  • The advent of robotics technology in the last two decades has been a revolutionary development

  • To design performance-based adaptive robotic interfaces, one needs to analyse the signals generated as a result of user activity while performing a control task

  • It was revealed that most of the variation in HFI signals could be explained by the first principal component, and peak detection was performed on the first principal component (a brief sketch of this check follows below)

[Table 2. Mean RMSE, Correlation (r) and Difference in Numbers of Peaks for elbow (Elbw.) and shoulder (Shldr.) angles.]

[Figure: (a) HFI elbow flexion-extension angle estimates compared to precision position tracking (PPT) estimates; (b) HFI shoulder flexion-extension angle estimates compared to PPT estimates; (c) comparison of root mean squared error (RMSE) values for elbow flexion-extension between the HFI ...]
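The variance check behind the highlight above can be sketched as follows. The four-channel layout and the synthetic data are placeholders standing in for the HFI recordings, and the peak-detection parameters are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.signal import find_peaks

# Placeholder recording: 1000 samples across 4 HFI sensor channels.
signals = np.random.default_rng(0).standard_normal((1000, 4))

# Fit PCA and report how much variance the first component captures.
pca = PCA().fit(signals)
print(f"PC1 explains {pca.explained_variance_ratio_[0]:.1%} of the variance")

# Peak detection on the first principal component, e.g. to count movement
# repetitions for comparison against a reference tracking system.
pc1 = pca.transform(signals)[:, 0]
peaks, _ = find_peaks(pc1, distance=20)
print(f"{len(peaks)} peaks detected on PC1")
```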


Introduction

The advent of robotics technology in the last two decades has been a revolutionary development. Robotic interface technology has been deployed in automobiles to detect drivers’ state of mind by extracting facial, gesture and voice patterns [1]. Robotic interface controls are usually divided into four main categories [20]: assistive, challenging, haptic and noncontact-motivating. These robotic interface paradigms are static, in that they do not adapt the controller parameters based upon the performance measures of the users. Adapting the control parameters of a robotic interface has the advantage of autonomously tuning assistance to the changing needs of the users. To design performance-based adaptive robotic interfaces, one needs to analyse the signals generated as a result of user activity while performing a control task. Analysis of robotic interface signals usually describes the way the user handles complicated tasks whilst controlling a robot-assisted medical/rehabilitation device. In a recent study [25], the signals generated by a joystick user interface were assessed.
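A minimal sketch of the kind of performance-based adaptation argued for above: the update rule, parameter names and error measure are assumptions for illustration, not the paper's algorithm. Assistance is raised when the user's tracking error grows and lowered as performance improves.

```python
def update_assistance(gain, tracking_error, target_error=0.1,
                      rate=0.05, g_min=0.0, g_max=1.0):
    """One adaptation step: increase the assistance gain in proportion to
    how far the measured tracking error exceeds the target, and decrease
    it when the user outperforms the target (encouraging autonomy)."""
    gain += rate * (tracking_error - target_error)
    return min(max(gain, g_min), g_max)

# Example: the gain drifts downward as a user's error shrinks over sessions.
gain = 0.5
for err in [0.30, 0.22, 0.15, 0.09, 0.07]:
    gain = update_assistance(gain, err)
    print(f"error={err:.2f} -> assistance gain={gain:.3f}")
```

The point of such a rule is exactly the contrast drawn above with static paradigms: the interface tunes itself from performance measures rather than holding controller parameters fixed.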

Sensor-embedded HFI
Data Recording
Kinematics of Human Arm
Results
Correlation of the HFI and PPT optical system
Adduction
Participant
Experimental Set-up
EPW Control Scheme
Future Work