Abstract

Emotional activity in the human body is regulated mainly by the autonomic nervous system, the central nervous system, and higher-order cognition in the human brain. This paper proposes a method for recognizing pilots' emotional states during flight training tasks based on multimodal information fusion. Building on the human-computer interaction mode of flight training, we established an emotion perception and recognition system, a two-dimensional valence-arousal emotion model, and a multimodal intelligent information perception model; an intelligent perception system was designed to collect and perceive four kinds of peripheral physiological signals from pilots in real time. On top of traditional machine learning models, a binary tree support vector machine was designed to optimize and improve the multimodal information fusion decision model, which increased the accuracy of emotional state recognition in flight training by 37.58% on average. Experimental results show that the method achieves accurate real-time monitoring and identification of emotional state, helps to improve flight training effectiveness and flight safety, maintains operational efficiency, and has significant research value and application prospects in the field of pilot training.
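The binary tree support vector machine mentioned above can be sketched as follows: a multi-class classifier built from binary SVMs arranged in a tree, where each internal node separates the remaining classes into two groups and prediction walks the tree until a single class is left. This is a minimal illustrative sketch (not the paper's implementation); the four example labels standing in for valence-arousal quadrants, the class-splitting rule, and all parameter choices are assumptions.

```python
import numpy as np
from sklearn.svm import SVC


class BinaryTreeSVM:
    """Illustrative binary-tree SVM for multi-class classification.

    Each internal node trains a binary SVM that separates the remaining
    classes into two halves; prediction descends the tree until a leaf
    (a single class) is reached.
    """

    def __init__(self, **svm_params):
        self.svm_params = svm_params  # forwarded to sklearn's SVC

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.classes_ = np.unique(y)
        self.tree_ = self._build(X, y, list(self.classes_))
        return self

    def _build(self, X, y, classes):
        if len(classes) == 1:
            return classes[0]                       # leaf: one class left
        mid = len(classes) // 2
        left, right = classes[:mid], classes[mid:]  # simple half-split rule
        mask = np.isin(y, classes)                  # samples still in play
        Xn, yn = X[mask], y[mask]
        labels = np.isin(yn, right).astype(int)     # 0 = left group, 1 = right
        clf = SVC(**self.svm_params).fit(Xn, labels)
        return (clf, self._build(X, y, left), self._build(X, y, right))

    def _walk(self, node, x):
        if not isinstance(node, tuple):
            return node                             # reached a leaf class
        clf, left, right = node
        side = clf.predict(x.reshape(1, -1))[0]
        return self._walk(right if side else left, x)

    def predict(self, X):
        return np.array([self._walk(self.tree_, x) for x in np.asarray(X)])
```

As a usage sketch, four well-separated clusters (standing in for the four valence-arousal quadrants) can be classified with three binary SVMs arranged as one root node and two child nodes.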
