Artificial intelligence (AI), computer simulations, and virtual reality (VR) are increasingly accessible tools for implementing training protocols and educational resources. Assessment of the sensory and neural processing associated with task performance in virtual environments typically relies on self-reported surveys. Electroencephalography (EEG), by contrast, offers an objective means of comparing the effects of different types of sensory feedback (e.g., auditory, visual, and haptic) in simulation environments. However, it can be challenging to determine which aspects of the EEG signal reflect the impact of a given type of sensory feedback on neural processing. Machine learning approaches offer a promising direction for identifying EEG signal features that differentiate the impact of different types of sensory feedback during simulation training. In the current study, machine learning techniques were applied to differentiate neural circuitry associated with haptic and non-haptic feedback in a simulated drilling task. Nine EEG channels were selected and analyzed, and time-domain, frequency-domain, and nonlinear features were extracted, yielding 360 candidate features (40 per channel). A feature selection stage identified the most relevant features: the Hurst exponent of the 13-21 Hz band, kurtosis of the 21-30 Hz band, power spectral density of the 21-30 Hz band, variance of the 21-30 Hz band, and spectral entropy of the 13-21 Hz band. Using those five features, trials with haptic feedback were distinguished from those without haptic feedback with an accuracy exceeding 90%, increasing to 99% when using 10 features. These results show promise for the future application of machine learning approaches to predict the impact of haptic feedback on neural processing during VR protocols involving drilling tasks, which can inform future applications of VR and simulation for occupational skill acquisition.
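The five selected features named above can be sketched in code. The following is a minimal illustration, not the authors' pipeline: it assumes a single-channel trial sampled at a hypothetical 250 Hz (the abstract does not state the sampling rate), band-pass filters the signal into the 13-21 Hz and 21-30 Hz bands, and computes the Hurst exponent, kurtosis, mean power spectral density, variance, and normalized spectral entropy. The Hurst estimator here is a simple variance-of-increments method; the study's exact estimator and filter design are not specified.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch
from scipy.stats import kurtosis

FS = 250  # assumed sampling rate in Hz (not stated in the abstract)

def bandpass(x, lo, hi, fs=FS, order=4):
    # Zero-phase Butterworth band-pass filter
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def hurst_exponent(x, max_lag=20):
    # Variance-of-increments estimator: std of lagged differences scales as lag**H
    lags = np.arange(2, max_lag)
    tau = [np.std(x[lag:] - x[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return slope

def spectral_entropy(x, fs=FS):
    # Shannon entropy of the normalized Welch power spectrum, scaled to [0, 1]
    _, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
    p = pxx / pxx.sum()
    return -np.sum(p * np.log2(p + 1e-12)) / np.log2(len(p))

def features_for_channel(x):
    # The two frequency bands named in the abstract
    band_13_21 = bandpass(x, 13, 21)
    band_21_30 = bandpass(x, 21, 30)
    _, pxx = welch(band_21_30, fs=FS, nperseg=min(256, len(band_21_30)))
    return {
        "hurst_13_21": hurst_exponent(band_13_21),
        "kurtosis_21_30": kurtosis(band_21_30),
        "psd_21_30": pxx.mean(),
        "variance_21_30": np.var(band_21_30),
        "spectral_entropy_13_21": spectral_entropy(band_13_21),
    }

# Synthetic 2-second single-channel "trial" standing in for real EEG data
rng = np.random.default_rng(0)
trial = rng.standard_normal(2 * FS)
print(features_for_channel(trial))
```

In the study, feature vectors like this (across all nine channels) would feed a feature selection stage and then a classifier separating haptic from non-haptic trials; those stages are not reproduced here.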