Abstract

Robot-assisted rehabilitation systems are designed to monitor patient performance and adapt the intensity and difficulty of rehabilitation tasks to each patient's needs. These systems can be more effective if they also recognize the patient's emotions and adjust the task difficulty accordingly to increase engagement. We aim to develop an emotion recognition model for a robot-assisted rehabilitation system using electroencephalography (EEG) and physiological signals: blood volume pulse (BVP), skin temperature (ST), and skin conductance (SC). The emotions are grouped into three categories: positive (pleasant), negative (unpleasant), and neutral. A machine-learning algorithm, Gradient Boosting Machines (GBM), and a deep-learning algorithm, Convolutional Neural Networks (CNN), are used to classify pleasant, unpleasant, and neutral emotions from the recorded EEG and physiological signals. Subjects viewed pleasant, unpleasant, and neutral images from the IAPS database while EEG and physiological signals were collected. Classification accuracies of the GBM and CNN methods are compared when a single modality (EEG, BVP, SC, or ST) is used and when EEG and physiological signals are combined.
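
The following is a minimal sketch, not the authors' implementation, of the kind of fused-modality GBM classification the abstract describes: hypothetical EEG, BVP, SC, and ST feature vectors are concatenated and fed to scikit-learn's GradientBoostingClassifier for three-class emotion prediction. The feature dimensions and the synthetic data are illustrative assumptions only.

# Sketch: three-class emotion classification (pleasant / unpleasant / neutral)
# with a Gradient Boosting Machine on fused EEG + physiological features.
# Feature dimensions and synthetic data below are assumptions for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_trials = 300                            # hypothetical number of labelled trials
eeg = rng.normal(size=(n_trials, 64))     # e.g. band-power features per EEG channel
bvp = rng.normal(size=(n_trials, 4))      # e.g. heart-rate statistics from BVP
sc  = rng.normal(size=(n_trials, 4))      # e.g. skin-conductance response features
st  = rng.normal(size=(n_trials, 2))      # e.g. skin-temperature mean and slope
y = rng.integers(0, 3, size=n_trials)     # 0 = neutral, 1 = pleasant, 2 = unpleasant

# Fuse modalities by concatenating their feature vectors per trial.
X = np.hstack([eeg, bvp, sc, st])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05)
clf.fit(X_tr, y_tr)
print("fused-feature accuracy:", accuracy_score(y_te, clf.predict(X_te)))

Single-modality runs (EEG, BVP, SC, or ST alone) would use the same pipeline with only one of the feature blocks in place of the concatenated matrix, which is the comparison the abstract reports.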
