Abstract

Affective computing plays an important role in human–computer interaction. In interactive cloud computing over big data in particular, affective modeling and analysis face extremely high complexity and uncertainty in emotional status, along with reduced computational accuracy. In this paper, an approach for affective experience evaluation in an interactive environment is presented to enhance the significance of such findings. Taking a person-independent model and cooperative interaction as core factors, facial expression features and states are used as affective indicators to perform synergetic dependence evaluation and to construct a participant's affective experience distribution map in the interactive big data space. The resulting model can potentially analyze the consistency between a participant's inner emotional status and external facial expressions, even in the presence of hidden emotions, within interactive computing. Experiments are conducted to evaluate the rationality of the proposed affective experience modeling approach. Satisfactory results on real-time camera input demonstrate availability and validity comparable to the best results achieved using facial expressions alone from real-world big data. These results suggest that the person-independent model with cooperative interaction and synergetic dependence evaluation can construct a participant's affective experience distribution and accurately perform real-time analysis of affective experience consistency over interactive big data. The affective experience distribution serves as an individualized, intelligent method for both analysis modeling and affective computing, on the basis of which affective facial expression recognition and synthesis in interactive cloud computing can be further investigated.

