Abstract

In this paper, we collect training and test data for grasping hand gestures with an sEMG sensor, apply an appropriately modified deep-learning convolutional neural network (CNN), and classify typical object-grasping gestures with a classification accuracy of approximately 93.8%. In addition, we construct a system that drives a robot hand in real time from these classified commands, toward an active prosthesis.
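
The abstract does not specify the network architecture, so the following is only a minimal sketch of the kind of pipeline it describes: a small 1-D CNN (here in PyTorch) that maps a window of multi-channel sEMG samples to one of several grasp-gesture classes. The channel count, window length, number of gesture classes, layer sizes, and the class name EMGGestureCNN are illustrative assumptions, not the authors' model.

```python
# Minimal sketch, not the authors' model: a 1-D CNN that classifies a window
# of multi-channel sEMG samples into one of several grasp-gesture classes.
# All shapes and hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn

N_CHANNELS = 8      # assumed number of sEMG electrodes
WINDOW_LEN = 200    # assumed samples per classification window
N_GESTURES = 6      # assumed number of grasp-gesture classes

class EMGGestureCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (WINDOW_LEN // 4), 128),
            nn.ReLU(),
            nn.Linear(128, N_GESTURES),
        )

    def forward(self, x):
        # x has shape (batch, N_CHANNELS, WINDOW_LEN)
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EMGGestureCNN()
    window = torch.randn(1, N_CHANNELS, WINDOW_LEN)   # one sEMG window (random placeholder data)
    gesture_id = model(window).argmax(dim=1).item()   # predicted gesture class index
    print("predicted gesture:", gesture_id)
```

In a real-time prosthetic controller, the predicted gesture index would then be translated into a motor command for the robot hand; that mapping is application-specific and is not detailed in the abstract.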
