Abstract

Social distancing and remote work have become more prevalent in the post-COVID world, and the demand for remote healthcare sessions has grown with them. Although an increasing number of such sessions use online platforms as the medium of communication, critical cues such as the trainee's affective state and other feedback opportunities are lost during the transmission of this digital information. This paper presents a solution that leverages a brain-computer interface (BCI) system for affective feedback and a humanoid robot for effective teaching during remote sessions. The solution uses a Kinect as the sensing mechanism on the trainer's side and state-of-the-art deep learning algorithms at the back end to infer the emotional state of the trainee. The training poses (from the humanoid's camera feed and the Kinect) are estimated using AlphaPose and compared using inverse kinematics. To ascertain the trainee's state (high vs. low valence and arousal), a Capsule Network is used, which achieves an average classification accuracy of 90.4% with a low average inference time of 14.3 ms on the publicly available DREAMER and AMIGOS datasets. The system also allows real-time communication through the humanoid, making the experience more engaging for the trainee.
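
To make the pose-comparison step concrete, the following is a minimal sketch (not the authors' code) of how two AlphaPose-style 2D keypoint sets could be compared by joint angles, one common way to realize the inverse-kinematics comparison the abstract describes. The COCO keypoint indices, the chosen joint triplets, and the helper names are assumptions for illustration only.

    import numpy as np

    def joint_angle(a, b, c):
        # Angle (radians) at joint b formed by keypoints a-b-c.
        u, v = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
        return np.arccos(np.clip(cos, -1.0, 1.0))

    # Assumed COCO-style triplets: shoulder-elbow-wrist and hip-knee-ankle, both sides.
    TRIPLETS = [(5, 7, 9), (6, 8, 10), (11, 13, 15), (12, 14, 16)]

    def pose_error(trainer_kp, trainee_kp):
        # Mean absolute joint-angle difference between two (17, 2) keypoint arrays.
        diffs = [abs(joint_angle(trainer_kp[a], trainer_kp[b], trainer_kp[c])
                     - joint_angle(trainee_kp[a], trainee_kp[b], trainee_kp[c]))
                 for a, b, c in TRIPLETS]
        return float(np.mean(diffs))

Comparing angles rather than raw coordinates has the advantage of being invariant to body size and position in the frame, which matters here because the trainer is observed through a Kinect while the trainee is observed through the humanoid's camera.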

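The abstract names a Capsule Network for the high/low valence-arousal classification but gives no architecture. Below is a generic minimal capsule classifier in PyTorch with dynamic routing-by-agreement (in the style of Sabour et al., 2017) over windows of raw EEG; the channel count (14, matching DREAMER's Emotiv headset), the 128-sample window, and all layer sizes are assumptions, not the paper's configuration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def squash(s, dim=-1):
        # Capsule non-linearity: shrinks short vectors toward zero, preserves direction.
        n2 = (s * s).sum(dim=dim, keepdim=True)
        return (n2 / (1.0 + n2)) * s / torch.sqrt(n2 + 1e-8)

    class RoutingCaps(nn.Module):
        # One layer of dynamic routing between capsule layers.
        def __init__(self, in_caps, in_dim, out_caps, out_dim, iters=3):
            super().__init__()
            self.iters = iters
            self.W = nn.Parameter(0.01 * torch.randn(out_caps, in_caps, out_dim, in_dim))

        def forward(self, u):                                    # u: (B, in_caps, in_dim)
            u_hat = torch.einsum('jidk,bik->bijd', self.W, u)    # predictions: (B, in_caps, out_caps, out_dim)
            b = torch.zeros(u_hat.shape[:3], device=u.device)    # routing logits
            for _ in range(self.iters):
                c = F.softmax(b, dim=2)                          # couplings over output capsules
                v = squash((c.unsqueeze(-1) * u_hat).sum(dim=1)) # (B, out_caps, out_dim)
                b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)     # agreement update
            return v

    class EmotionCapsNet(nn.Module):
        # Assumed input: (B, 14, 128) — 14 EEG channels, 128-sample windows.
        def __init__(self, eeg_channels=14, n_classes=2):
            super().__init__()
            self.conv = nn.Conv1d(eeg_channels, 64, kernel_size=9, stride=2)
            self.primary = nn.Conv1d(64, 16 * 8, kernel_size=9, stride=2)  # 16 capsule types, 8-D each
            self.caps = RoutingCaps(in_caps=16 * 26, in_dim=8,             # 26 = temporal length after both convs
                                    out_caps=n_classes, out_dim=16)

        def forward(self, x):
            h = F.relu(self.conv(x))                             # (B, 64, 60)
            p = self.primary(h)                                  # (B, 128, 26)
            B, _, T = p.shape
            p = p.view(B, 16, 8, T).permute(0, 1, 3, 2).reshape(B, 16 * T, 8)
            v = self.caps(squash(p))                             # (B, n_classes, 16)
            return v.norm(dim=-1)                                # class score = capsule length

In this formulation the length of each output capsule scores its class, so the two capsule lengths here would score the high and low valence-arousal states, trained with a margin or cross-entropy loss over those lengths.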