Abstract

In this paper, a muscle–gesture computer interface (MGCI) system for robot navigation control is proposed, employing the commercial wearable MYO gesture control armband, a motion and gesture control device from Thalmic Labs. The software interface is developed using LabVIEW and Visual Studio C++. The hardware interface between the Thalmic Labs MYO armband and the robotic arm is implemented using a National Instruments myRIO, which provides the required real-time EMG data. The system allows the user to control a three-degrees-of-freedom robotic arm remotely through intuitive motion by combining real-time electromyography (EMG) and inertial measurement unit (IMU) signals. Computer simulations and experiments are developed to evaluate the feasibility of the proposed system. The user wears the armband and moves his/her hand, and the robotic arm imitates the hand's motion. The armband picks up the EMG signals of the hand muscles, which are time-varying, noisy signals; these MYO EMG signals are then processed and classified in LabVIEW to compute the joint angles used as feedback to the servo motors that drive the robotic arm. A simulation study of the system showed very good results, and tests show that the robotic arm imitates the arm motion at an acceptable rate and with very good accuracy.
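To illustrate the kind of pipeline the abstract describes (raw EMG → envelope extraction → gesture classification → servo angle), here is a minimal, self-contained Python sketch. It is not the paper's LabVIEW/myRIO implementation: the RMS-envelope window, the thresholds, and the gesture-to-angle mapping are all illustrative assumptions, and the EMG trace is synthetic.

```python
import math
import random

def rms_envelope(signal, window=50):
    """Sliding-window RMS envelope: a common way to smooth the
    noisy, time-varying raw EMG before classification."""
    env = []
    for i in range(len(signal)):
        w = signal[max(0, i - window + 1): i + 1]
        env.append(math.sqrt(sum(s * s for s in w) / len(w)))
    return env

def classify(envelope_value, rest_threshold=0.2, strong_threshold=0.6):
    """Toy two-threshold classifier mapping muscle activation level
    to a gesture label (thresholds are illustrative)."""
    if envelope_value < rest_threshold:
        return "rest"
    elif envelope_value < strong_threshold:
        return "weak_contraction"
    return "strong_contraction"

def gesture_to_servo_angle(gesture):
    """Map each gesture class to a target servo angle in degrees
    (an assumed mapping, for illustration only)."""
    return {"rest": 0, "weak_contraction": 45, "strong_contraction": 90}[gesture]

# Synthetic raw EMG: low-amplitude noise (rest) followed by a strong burst.
random.seed(0)
raw = ([random.gauss(0, 0.05) for _ in range(200)] +
       [random.gauss(0, 0.8) for _ in range(200)])

env = rms_envelope(raw)
print(classify(env[100]))  # sample inside the quiet (rest) segment
print(classify(env[350]))  # sample inside the strong burst
```

In a real system the classifier would be trained on labeled multi-channel EMG features rather than fixed thresholds, and the computed angle would be streamed to the servo controller in real time.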

