Abstract

Affective computing is a rapidly developing field at the intersection of psychology and artificial intelligence. Systems that recognise human emotions from photos and videos (facial expressions), voice recordings (intonation), gestures, posture, gait and other data are being actively developed using various machine learning algorithms. This research project is devoted to recognising a person's emotions and mental states through the sensors of mobile devices (accelerometer, gyroscope, etc.), which record the micro- and macro-movements of the hand characteristic of the states under study. The pilot study examined whether psychoemotional states can be detected and differentiated from the sensor readings of mobile devices (tablet, smartphone) using machine learning models. As a result, we obtained models that determine, from the sensor readings of a mobile device held in the hands, whether the person is in a neutral emotional state or under stress. Moreover, the state of stress could be differentiated into two modalities: stress caused by psychological factors ("fulfilment of obligations") and by psychophysiological factors (unpleasant noise in the headphones). The statistically significant differences, together with the relatively high accuracy of the constructed machine learning models, support the reliability of the results and confirm the hypothesis that emotional states can be identified and classified using the sensors of mobile devices.
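The abstract does not specify the features or classifier used; the sketch below is a hypothetical illustration of the general pipeline it describes: windows of accelerometer/gyroscope readings are reduced to simple statistical features (per-axis mean and standard deviation) and classified into the three states. The data here is synthetic, and the nearest-centroid classifier and the state-dependent movement-intensity scales are assumptions for illustration only, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(window):
    # window: (n_samples, 6) -- 3 accelerometer + 3 gyroscope axes.
    # Per-axis mean and standard deviation as a minimal feature vector.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def make_windows(n, scale):
    # Synthetic tremor-like signals; `scale` crudely mimics a
    # state-dependent difference in micro-movement intensity.
    return [rng.normal(0.0, scale, size=(128, 6)) for _ in range(n)]

# Hypothetical intensity levels for the three states in the study.
states = {"neutral": 0.5, "psych_stress": 1.0, "noise_stress": 1.5}

X, y = [], []
for label, scale in states.items():
    for w in make_windows(40, scale):
        X.append(extract_features(w))
        y.append(label)
X = np.array(X)

# Nearest-centroid classifier: one mean feature vector per class.
labels = sorted(states)
centroids = {c: X[[i for i, l in enumerate(y) if l == c]].mean(axis=0)
             for c in labels}

def predict(window):
    f = extract_features(window)
    return min(labels, key=lambda c: np.linalg.norm(f - centroids[c]))

# Classify a fresh synthetic window with high movement intensity.
sample = rng.normal(0.0, 1.5, size=(128, 6))
print(predict(sample))
```

A real system would replace the synthetic windows with labelled sensor recordings and would likely use richer features (spectral energy, jerk, correlation between axes) and a stronger model, but the overall shape of the pipeline stays the same.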
