Abstract

We present a novel method for simultaneously tracking arm movement and muscle activity in real time, with application to prosthetic hand manipulation and multi-robot systems. The system utilises mechanomyography (MMG), a low-frequency rumble produced by contracting muscle, to control grasping, and an inertial measurement unit (IMU) to change the hand gesture through specific arm orientations. Current gesture control systems rely on either robot vision or a physical action by the user, such as pressing a button or physically interacting with the robot, to change the hand gesture; neither is feasible for control in the field. Using custom-made IMU and MMG sensors, we combine information to both control a robotic hand and change its gesture remotely. The system has been tested on a prosthetic hand for grasp control, with results showing rapid user adoption of the method with minimal false commands. Furthermore, we compare our method with electromyography (EMG) methods, taking into account the donning and doffing times required for both EMG and MMG sensors. Future work will involve applying the system to a range of robotic control applications, including human augmentation and human control of multi-robot systems.
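The control scheme described above can be sketched in a few lines: an MMG envelope crossing a threshold toggles the grasp, while the IMU's reported arm orientation selects the active gesture. The thresholds, orientation bins, and gesture names below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of MMG/IMU hand control: a supra-threshold
# muscle contraction (MMG envelope) toggles grasp open/close, and
# the IMU pitch angle of the arm selects which gesture is active.

MMG_THRESHOLD = 0.5  # hypothetical normalised MMG envelope threshold

GESTURES = {          # hypothetical orientation-to-gesture mapping
    "arm_down": "power_grip",
    "arm_level": "pinch",
    "arm_up": "tripod",
}

def classify_orientation(pitch_deg: float) -> str:
    """Map IMU pitch (degrees) to a coarse arm orientation bin."""
    if pitch_deg < -30:
        return "arm_down"
    if pitch_deg > 30:
        return "arm_up"
    return "arm_level"

def control_step(mmg_envelope: float, pitch_deg: float, hand_closed: bool):
    """One control-loop step: returns (active gesture, new grasp state)."""
    gesture = GESTURES[classify_orientation(pitch_deg)]
    # A strong contraction toggles the grasp without any button press.
    if mmg_envelope > MMG_THRESHOLD:
        hand_closed = not hand_closed
    return gesture, hand_closed

# Example: a level arm selects "pinch"; a strong contraction closes the hand.
gesture, closed = control_step(0.8, 10.0, hand_closed=False)
```

In a real system the threshold step would be replaced by the paper's MMG classifier and the orientation bins by the trained gesture-switching poses; the sketch only shows how the two sensor streams divide the control problem.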
