Abstract

Powered prosthetic hands capable of executing various grasp patterns are highly sought-after solutions for upper-limb amputees. A crucial requirement for such prosthetic hands is accurate identification of the intended grasp pattern and subsequent activation of the prosthetic digits accordingly. Vision-based grasp classification techniques offer improved coordination between amputees and prosthetic hands without physical contact. Deep learning methods, particularly Convolutional Neural Networks (CNNs), are used to process visual information for classification. The key challenge lies in developing a model that generalizes across diverse object shapes while accurately classifying grasp classes. To address this, a compact CNN model named GraspCNet is proposed, designed specifically for grasp classification in prosthetic hands. The use of separable convolutions reduces the computational burden, making the model potentially suitable for real-time applications on embedded systems. GraspCNet is designed to learn and generalize from object shapes, allowing it to classify unseen objects beyond those included in the training dataset. The proposed model was trained and tested on several standard object datasets. A cross-validation strategy was adopted to evaluate performance in both seen- and unseen-object scenarios. The average accuracy achieved was 82.22% for seen object classes and 75.48% for unseen object classes. In computer-based real-time experiments, GraspCNet achieved an accuracy of 69%. A comparative analysis with state-of-the-art techniques revealed that GraspCNet outperformed most benchmark techniques and performed comparably to the DcnnGrasp method. The compact nature of GraspCNet suggests its potential for integration with other sensing modalities in prosthetic hands.
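
The abstract does not give GraspCNet's exact layer configuration, so the following is a minimal illustrative sketch (in PyTorch) of the depthwise separable convolution technique it relies on for efficiency. The class name SeparableConvBlock, the channel counts, and the 224x224 input resolution are assumptions for illustration, not the authors' implementation.

    import torch
    import torch.nn as nn

    class SeparableConvBlock(nn.Module):
        """Depthwise separable convolution: a k x k depthwise conv followed
        by a 1x1 pointwise conv. Relative to a standard k x k convolution,
        this factorization reduces parameters and multiply-adds to roughly
        (1/C_out + 1/k^2) of the original cost (MobileNet-style)."""

        def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3):
            super().__init__()
            # Depthwise: one k x k filter per input channel (groups=in_channels).
            self.depthwise = nn.Conv2d(
                in_channels, in_channels, kernel_size,
                padding=kernel_size // 2, groups=in_channels, bias=False,
            )
            # Pointwise: 1x1 conv mixes information across channels.
            self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False)
            self.bn = nn.BatchNorm2d(out_channels)
            self.act = nn.ReLU(inplace=True)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.act(self.bn(self.pointwise(self.depthwise(x))))

    # Hypothetical usage: one RGB object image at an assumed 224x224 resolution.
    if __name__ == "__main__":
        block = SeparableConvBlock(3, 32)
        x = torch.randn(1, 3, 224, 224)
        print(block(x).shape)  # torch.Size([1, 32, 224, 224])

Stacking such blocks in place of standard convolutions is what makes compact models of this kind feasible on embedded hardware, since the pointwise stage carries most of the capacity while the depthwise stage stays cheap.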
