Abstract

The stress level of high-speed rail (HSR) train drivers directly affects their job performance and thus the safety of HSR operations. This paper develops a quantitative understanding of train drivers' stress levels and the contributing factors through an experimental study conducted in a realistic HSR simulator. An extensive statistical analysis found that ultra-short-term heart rate variability (HRV) metrics can differentiate between stress levels. Three machine-learning classifiers were evaluated for stress detection: support vector machine (SVM), random forest (RF), and K-nearest neighbour (KNN). The RF model performed best in terms of robustness and classification accuracy. Moreover, the research found that detection should target the driver's stress level rather than the type of stressor. The findings could contribute to the development of real-time HSR driver condition monitoring systems and to the improvement of current HSR operation safety regulations.
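
The classifier comparison summarized above could be set up as in the following minimal sketch, which evaluates SVM, RF, and KNN models with cross-validation on HRV-style features; the feature set, labels, data, and hyperparameters are illustrative assumptions only and are not taken from the paper.

```python
# Illustrative sketch: comparing SVM, RF, and KNN for stress-level
# classification from HRV-style features. All data and settings below are
# placeholders, not the paper's actual dataset or configuration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder ultra-short-term HRV features (e.g. mean RR, SDNN, RMSSD, LF/HF)
# for 300 driving segments; labels 0 = low, 1 = medium, 2 = high stress.
X = rng.normal(size=(300, 4))
y = rng.integers(0, 3, size=300)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
}
for name, model in models.items():
    # 5-fold cross-validated accuracy as a simple basis for comparison.
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```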
