To address the problem of accurately assessing the training effects of flight simulators, this paper proposes an intelligent algorithm based on neural networks and reinforcement learning. Multi-dimensional data comprising facial expression features and electroencephalography (EEG) and eye-movement (EM) physiological signals are analyzed. A new evaluation model for assessing pilot training states such as spatial balance, attention distribution, and neurasthenia is developed, with facial expression experiments as the primary channel, supplemented by EM and EEG experiments. EEG signals are acquired and analyzed during pilot training tasks (take-off and landing), and the emotional characteristics of pilots during training are identified. We fuse the data from the multi-dimensional channels, construct mathematical models of pilot maneuver reaction time and attention allocation, monitor and evaluate flight training effects, and conduct controlled experiments. The experimental results show that average recognition rates of 92.598% and 87.013% were achieved for facial expression and neurasthenia recognition, respectively, and that the ergonomic information from facial expressions, EEG, and EM was effectively fused.