Abstract

Recognizing applied hand forces from force myography (FMG) biosignals requires adequate training data to facilitate physical human-robot interaction (pHRI). In practice, however, data are often scarce, and labels are usually unavailable or time-consuming to generate. Synthesizing FMG biosignals can be a viable solution. In this paper, we therefore propose, for the first time, a dual-phase algorithm based on semi-supervised adversarial learning that combines a small amount of labeled real FMG data with generated unlabeled synthetic FMG data. We conducted a pilot study to test this algorithm in estimating applied forces during interactions with a Kuka robot along the 1-D X, Y, and Z directions. First, an unsupervised FMG-based deep convolutional generative adversarial network (FMG-DCGAN) model was employed to generate realistic synthetic FMG data. A variety of transformation functions were applied for domain randomization, increasing data variability and representing authentic physiological and environmental changes. The cosine similarity score and the generated-to-input-data ratio served as decision criteria to minimize the reality gap between real and synthetic data and to reduce the risk of incorrect predictions. The FMG-DCGAN model was then pretrained to generate pseudo-labels for unlabeled real and synthetic data, and was subsequently retrained on all labeled and pseudo-labeled data, yielding what we term the self-trained FMG-DCGAN model. Finally, this model was evaluated on unseen real test data and achieved force-estimation accuracies of 77% &#x003C; R<sup>2</sup> &#x003C; 85%, compared with 78% &#x003C; R<sup>2</sup> &#x003C; 89% for the corresponding supervised baseline model. The proposed method can therefore be practical for FMG-based HRI, rehabilitation, and prosthetic control in daily, repetitive use, even with few labeled data.
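The synthetic-data filtering step described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the similarity threshold, and the generated-to-input-data cap are hypothetical placeholders chosen for illustration; the abstract does not report the actual values used.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two flattened FMG signal windows."""
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def filter_synthetic(real_batch, synth_batch, threshold=0.9, max_ratio=2.0):
    """Keep synthetic samples resembling real data (hypothetical criteria).

    A synthetic sample is accepted if its best cosine similarity to any
    real sample meets `threshold`; the accepted count is capped at
    `max_ratio` times the real-data count (the generated-to-input ratio).
    Both values are illustrative assumptions, not from the paper.
    """
    cap = int(max_ratio * len(real_batch))
    kept = []
    for s in synth_batch:
        if len(kept) >= cap:
            break
        best = max(cosine_similarity(s, r) for r in real_batch)
        if best >= threshold:
            kept.append(s)
    return kept
```

In this sketch, a sample nearly parallel to some real window passes the similarity gate, while a dissimilar one is discarded; the ratio cap keeps the synthetic set from overwhelming the real training data.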
