Abstract
The fast growth of human–robot collaboration suggests that a human operator could command a robot without a mechanical interface, provided that effective communication channels are established. In noisy, vibrating, or light-sensitive environments, many sensors for detecting human intention face critical limitations; biological signals, such as electromyographic (EMG) signals, appear more effective. A novel model-based EMG signal (MBES) classifier has been developed to command a laboratory collaborative robot powered by McKibben pneumatic muscles, actuators that are promising for human–robot collaboration owing to their inherent compliance and safety features. The classifier is based on a single EMG sensor (a Myotrac one), an Arduino Uno, and dedicated code, developed in the Matlab environment, that performs the EMG signal recognition. The classifier can recognize the EMG signals generated by three hand-finger movements, regardless of the amplitude and time duration of the signal and of the muscular effort, by relying on three mathematical models: exponential, fractional, and Gaussian. These models were selected because they best fit the EMG signal curves. Each recognized movement can be assigned a consent signal for triggering the desired pick-and-place task by the robot. An experimental activity was carried out to test the classifier and tune its performance. The validated classifier was then applied to control three pressure levels of a McKibben-type pneumatic muscle. Encouraging results suggest that the developed classifier can serve as a valid command interface for robotic purposes.
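The abstract describes classifying an EMG burst by fitting three candidate curves and keeping the best fit. The exact model equations are not given in this excerpt, so the sketch below uses illustrative stand-ins (a decaying exponential, a power-law as the "fractional" form, and a Gaussian) and assumes the classification rule is lowest least-squares residual on an amplitude-normalized envelope; the function names and parameters are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Candidate models -- illustrative stand-ins, not the paper's exact forms.
def exponential(t, a, b):
    return a * np.exp(-b * t)

def fractional(t, a, b):  # assumed power-law form for the "fractional" model
    return a * t ** b

def gaussian(t, a, mu, sigma):
    return a * np.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

# Model name -> (function, initial parameter guess)
MODELS = {
    "exponential": (exponential, (1.0, 1.0)),
    "fractional": (fractional, (1.0, 0.5)),
    "gaussian": (gaussian, (1.0, 0.5, 0.2)),
}

def classify_burst(t, y):
    """Return the name of the model whose least-squares fit has the
    smallest residual on the amplitude-normalized EMG envelope."""
    y = y / np.max(np.abs(y))  # normalization makes the result amplitude-independent
    best_name, best_err = None, np.inf
    for name, (model, p0) in MODELS.items():
        try:
            params, _ = curve_fit(model, t, y, p0=p0, maxfev=5000)
        except RuntimeError:
            continue  # fit failed to converge; skip this model
        err = float(np.sum((model(t, *params) - y) ** 2))
        if err < best_err:
            best_name, best_err = name, err
    return best_name
```

For example, a Gaussian-shaped burst sampled on `t = np.linspace(0.01, 1.0, 100)` would be labeled `"gaussian"` because its residual is essentially zero, while the monotone exponential and power-law fits leave a larger error.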
Highlights
One trend of Industry 4.0 is advanced robotics [1]: it predicts wide use of collaborative robots, the so-called cobots, whose adoption is currently growing rapidly [2]
This new paradigm avoids replacing humans with robots; instead, it encourages human–robot collaboration (HRC), in which humans and robots work safely together and share the same workspace, and it expands the use of cobots to non-industrial applications
It follows that HRC can relieve human operators of heavy and alienating tasks, while ensuring their safety, if effective communication channels between humans and robots are established
Summary
One trend of Industry 4.0 is advanced robotics [1]: it predicts wide use of collaborative robots, the so-called cobots, whose adoption is currently growing rapidly [2]. Several approaches have been defined to ensure human safety; however, according to the specifications and guidelines defined in [4], they can still carry some risk if human factors are not properly considered. For this reason, on the one hand, the human operator must acquire and improve new skills for safety [5]; on the other hand, the human operator can take on a new role in the interaction with robots, whose control can occur not through mechanical interfaces (i.e., button switches, touch pads, contact sensors) activated by the operator, but through systems that directly detect human intention. It follows that HRC can relieve human operators of heavy and alienating tasks, while ensuring their safety, if effective communication channels between humans and robots are established