Situational Awareness (SA) assessment is of paramount importance in many domains, and particularly in military aviation, where it underpins safe decision-making. SA encompasses three levels in humans: perception, comprehension, and projection. Accurately evaluating SA status across these three levels is crucial for reducing human false-positive and false-negative rates when monitoring complex aviation scenarios. This study presents a comprehensive comparative analysis of two types of physiological records: electroencephalogram (EEG) signals and brain electrical activity mapping (BEAM) images. Both modalities are leveraged to automate precise SA evaluation using conventional machine learning as well as deep learning techniques. Benchmarking experiments show that the BEAM-based deep learning models attain state-of-the-art performance scores of 0.955 for both the SA perception and comprehension levels. Conversely, the EEG signal-based approach of manual feature extraction, selection, and classification achieves a superior accuracy of 0.929 for the projection level of SA. These findings highlight the potential of diverse physiological records as valuable computational tools for enhancing SA evaluation and, in turn, the safety of aviation decision-making.
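For illustration only, the sketch below outlines how the two kinds of pipelines contrasted in the abstract might look in code: hand-crafted EEG features fed to a conventional classifier with feature selection, versus a small convolutional network applied to BEAM-style images. All data shapes, feature choices, and model sizes here are assumptions for demonstration, not the authors' actual configuration.

# Minimal illustrative sketch of the two SA-classification pipelines described in the
# abstract. Data shapes, features, and model sizes are hypothetical placeholders.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
import torch
import torch.nn as nn

# --- Pipeline 1: EEG signals -> manual features -> selection -> classical classifier ---
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 200, 32, 256            # hypothetical EEG epoch shape
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)                      # e.g. low vs. high SA level

# Hypothetical hand-crafted features: per-channel signal power and variance
power = (eeg ** 2).mean(axis=2)
variance = eeg.var(axis=2)
features = np.concatenate([power, variance], axis=1)

eeg_clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=20),   # feature-selection step
    SVC(kernel="rbf"),              # conventional classifier
)
eeg_clf.fit(features, labels)

# --- Pipeline 2: BEAM images -> convolutional neural network ---
class BeamCNN(nn.Module):
    """Small CNN mapping a single-channel BEAM topographic image to SA classes."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(32, n_classes),
        )

    def forward(self, x):
        return self.net(x)

beam_images = torch.randn(8, 1, 64, 64)    # hypothetical batch of BEAM images
logits = BeamCNN()(beam_images)            # shape (8, 2): one score per SA class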