Abstract
We present a novel method for tracking arm movement and muscle activity simultaneously in real time, with application to prosthetic hand manipulation and multi-robot systems. The system utilises mechanomyography (MMG), a low-frequency rumble produced by contracting muscle, to control grasping, and an inertial measurement unit (IMU) to change the hand gesture through specific arm orientations. Current gesture control systems rely on either robot vision or a physical action by the user, such as pressing a button or physically interacting with the robot, to change the hand gesture; neither is feasible for control in the field. Using custom-made IMU and MMG sensors, we combine this information to both control a robotic hand and change its gesture remotely. The system has been tested on a prosthetic hand for grasp control, with results showing rapid user adaptation to the method and minimal false commands. Furthermore, we compare our method with electromyography (EMG) based approaches, taking into account the donning and doffing times required for both EMG and MMG sensors. Future work will involve the implementation of the system for a range of robotic control applications, including human augmentation and human control of multi-robot systems.
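The abstract does not give implementation details, but the control principle it describes (MMG amplitude triggering the grasp, IMU arm orientation selecting the gesture) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the thresholds, roll-angle bands, and function names are hypothetical and do not reflect the authors' implementation.

```python
import math

# Hypothetical thresholds (not from the paper); real values would be tuned per user.
MMG_RMS_THRESHOLD = 0.15     # normalised MMG envelope level that triggers a grasp
ROLL_GESTURE_BANDS = {       # assumed arm-roll ranges (degrees) mapped to hand gestures
    "power_grip": (-30.0, 30.0),
    "pinch_grip": (30.0, 90.0),
    "open_hand":  (-90.0, -30.0),
}

def rms(samples):
    """Root-mean-square amplitude of a window of MMG samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def select_gesture(roll_deg, current_gesture):
    """Switch gesture when the arm roll angle enters a predefined band."""
    for gesture, (lo, hi) in ROLL_GESTURE_BANDS.items():
        if lo <= roll_deg < hi:
            return gesture
    return current_gesture  # outside all bands: keep the current gesture

def control_step(imu_roll_deg, mmg_window, current_gesture):
    """One control-loop iteration: IMU selects the gesture, MMG triggers the grasp."""
    gesture = select_gesture(imu_roll_deg, current_gesture)
    grasp = rms(mmg_window) > MMG_RMS_THRESHOLD
    return gesture, grasp

# Example: an arm roll of 45 degrees selects the pinch grip; a strong muscle
# contraction (high MMG envelope) then closes the hand.
print(control_step(45.0, [0.2, -0.25, 0.3, -0.2], "power_grip"))
```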