Abstract

Purpose
Dyadic interactions are a significant part of human life. Most research based on body sensor networks focuses on daily actions, but little work has been done on recognizing affective actions during interactions. The purpose of this paper is to analyze and recognize affective actions collected from dyadic interactions.

Design/methodology/approach
A framework that combines hidden Markov models (HMMs) and k-nearest neighbors (kNN) via Fisher kernel learning is presented in this paper. Furthermore, different features are considered according to the interaction situation (positive or negative).

Findings
Three experiments are conducted in this paper. Experimental results demonstrate that the proposed Fisher kernel learning-based framework outperforms a plain Fisher kernel-based approach, HMMs alone, and kNN alone.

Practical implications
The research may help facilitate nonverbal communication. Moreover, it is important to equip social robots and animated agents with affective communication abilities.

Originality/value
The presented framework may gain strengths from both generative and discriminative models. Further, different features are considered based on the interaction situations.
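The abstract does not give implementation details, but the general idea it builds on — the Fisher kernel, where the gradient of a generative model's log-likelihood turns a variable-length sequence into a fixed-length vector that a discriminative classifier such as kNN can consume — can be sketched minimally. The sketch below assumes discrete-emission HMMs, a numerical gradient with respect to the emission matrix, and a toy two-class dataset; all function names and parameters are hypothetical illustrations, not the paper's method.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete HMM.
    pi: (N,) initial probs, A: (N, N) transitions, B: (N, M) emissions,
    obs: sequence of symbol indices."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    ll = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        ll += np.log(c)
        alpha = alpha / c
    return ll

def fisher_score(obs, pi, A, B, eps=1e-5):
    """Fisher score of one sequence: numerical gradient of the HMM
    log-likelihood with respect to the emission matrix B."""
    base = forward_loglik(obs, pi, A, B)
    grad = np.zeros_like(B)
    for i in range(B.shape[0]):
        for j in range(B.shape[1]):
            Bp = B.copy()
            Bp[i, j] += eps
            grad[i, j] = (forward_loglik(obs, pi, A, Bp) - base) / eps
    return grad.ravel()  # fixed-length vector, whatever len(obs) was

def nn_predict(x, X_train, y_train):
    """1-nearest-neighbor classification in Fisher-score space."""
    d = np.linalg.norm(X_train - x, axis=1)
    return y_train[int(np.argmin(d))]

# Toy demo: a 2-state, 2-symbol "background" HMM with uniform parameters.
N, M, T = 2, 2, 20
pi = np.full(N, 1.0 / N)
A = np.full((N, N), 1.0 / N)
B = np.full((N, M), 1.0 / M)

rng = np.random.default_rng(0)
# Class 0 sequences favor symbol 0; class 1 sequences favor symbol 1.
seqs = ([rng.choice(M, size=T, p=[0.9, 0.1]) for _ in range(4)]
        + [rng.choice(M, size=T, p=[0.1, 0.9]) for _ in range(4)])
labels = np.array([0] * 4 + [1] * 4)

# Map every sequence to its Fisher score under the background HMM,
# then hold out one sequence per class and classify it with 1-NN.
X = np.stack([fisher_score(s, pi, A, B) for s in seqs])
pred0 = nn_predict(X[0], X[1:], labels[1:])
pred1 = nn_predict(X[-1], X[:-1], labels[:-1])
```

Because the Fisher score has the same dimension for every sequence, the generative HMM and the discriminative kNN step compose cleanly, which is the structural point the abstract's "strengths from both generative and discriminative models" claim refers to.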

