Abstract

The automatic recognition of multiple affective states can be enhanced if the underpinning computational models explicitly consider the interactions between the states. This work proposes a computational model that incorporates the dependencies between four states (tiredness, anxiety, pain, and engagement) known to appear in virtual rehabilitation sessions of post-stroke patients, to improve the automatic recognition of the patients' states. A dataset of five post-stroke patients was used, comprising their finger pressure (PRE), hand movements (MOV), and facial expressions (FAE) during ten sessions of virtual rehabilitation. Our computational proposal uses the Semi-Naive Bayesian classifier (SNBC) as the base classifier in a multiresolution approach to create a multimodal model with the three sensors (PRE, MOV, and FAE), with late fusion using SNBC (FSNB classifier). There is an FSNB classifier for each state, and the classifiers are linked in a circular classifier chain (CCC) to exploit the dependency relationships between the states. The CCC achieves ROC AUC scores above 90% for each of the four states. Across the five patients, relationships of mutual exclusion between engagement and all the other states, as well as some co-occurrences between pain and anxiety, were detected. Virtual rehabilitation platforms that incorporate the automatic recognition of multiple patient states could leverage intelligent and empathic interactions to promote adherence to rehabilitation exercises.
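The circular classifier chain described above can be sketched as follows. This is an illustrative toy implementation only: the paper's base learner is a Semi-Naive Bayesian classifier with multimodal late fusion (FSNB) over three sensors, whereas here a minimal hand-rolled Gaussian Naive Bayes stands in, the single "pressure" feature and all data are synthetic, and the seeding strategy is an assumption. Only the four state names come from the abstract.

```python
# Sketch of a circular classifier chain (CCC): one classifier per state,
# each fed the sensor features plus the current predictions for the other
# states, cycled so the states can inform one another. Hypothetical,
# simplified stand-in for the paper's SNBC/FSNB setup.
import math

STATES = ["tiredness", "anxiety", "pain", "engagement"]

class TinyGaussianNB:
    """Gaussian Naive Bayes for binary labels; variances are floored so
    the 0/1 chain features cannot swamp the sensor features."""
    def fit(self, X, y):
        self.params = {}
        for c in (0, 1):
            rows = [x for x, t in zip(X, y) if t == c]
            cols = list(zip(*rows))
            means = [sum(col) / len(col) for col in cols]
            varis = [max(sum((v - m) ** 2 for v in col) / len(col), 0.25)
                     for col, m in zip(cols, means)]
            self.params[c] = (len(rows) / len(y), means, varis)
        return self

    def predict(self, x):
        scores = {}
        for c, (prior, means, varis) in self.params.items():
            lp = math.log(prior)
            for v, m, s2 in zip(x, means, varis):
                lp -= 0.5 * math.log(2 * math.pi * s2) + (v - m) ** 2 / (2 * s2)
            scores[c] = lp
        return max(scores, key=scores.get)

def train_ccc(X, Y):
    """Per state: a chain model trained on sensors plus the true labels of
    the other states, and a sensor-only model used to seed the chain."""
    chain, seed = {}, {}
    for i, s in enumerate(STATES):
        ys = [y[i] for y in Y]
        seed[s] = TinyGaussianNB().fit(X, ys)
        Xa = [x + [y[j] for j in range(len(STATES)) if j != i]
              for x, y in zip(X, Y)]
        chain[s] = TinyGaussianNB().fit(Xa, ys)
    return chain, seed

def predict_ccc(chain, seed, x, passes=2):
    """Seed each state from the sensors alone, then cycle through the
    chain so every state is re-predicted given the others."""
    pred = [seed[s].predict(x) for s in STATES]
    for _ in range(passes):
        for i, s in enumerate(STATES):
            xa = x + [pred[j] for j in range(len(STATES)) if j != i]
            pred[i] = chain[s].predict(xa)
    return dict(zip(STATES, pred))

# Synthetic toy data: high finger pressure co-occurs with tiredness,
# anxiety, and pain; low pressure with engagement (echoing, purely for
# illustration, the mutual exclusion the abstract reports).
X = [[0.9], [0.8], [1.0], [0.1], [0.2], [0.0]]
Y = [[1, 1, 1, 0]] * 3 + [[0, 0, 0, 1]] * 3
chain, seed = train_ccc(X, Y)
print(predict_ccc(chain, seed, [0.85]))
print(predict_ccc(chain, seed, [0.15]))
```

Seeding the chain with sensor-only predictions, rather than a fixed initial guess, avoids locking the loop into a state profile that merely matches the initialization.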
