This study aims to explore and validate effective eye movement features related to motion sickness (MS) through closed-track experiments and to provide insights for practical applications. With the development of autonomous vehicles (AVs), MS has attracted increasing attention. Eye movements, as an objective quantitative indicator of vestibular function, have great potential for evaluating the severity of MS. Eye movement signals can be collected easily and noninvasively with a camera, causing no discomfort or disturbance to passengers, which makes the approach highly applicable. Eye movement data were collected from 72 MS-susceptible participants in closed-track driving environments. We extracted features including blink rate (BR), total number of fixations (TNF), total duration of fixations (TDF), mean duration of fixations (MDF), saccade amplitude (SA), saccade duration (SD), and number of nystagmus events (NN). Statistical analysis and a multivariate long short-term memory fully convolutional network (MLSTM-FCN) were used to validate the effectiveness of these eye movement features. Statistical analysis revealed significant differences in the extracted features across different levels of MS. In binary classification, the MLSTM-FCN model achieved an accuracy of 91.37% for MS detection and 88.51% for MS prediction; in ternary classification, it achieved 80.54% for detection and 80.11% for prediction. These results indicate that evaluating MS through eye movements is effective and that the eye-movement-based MLSTM-FCN model can efficiently detect and predict MS. This work can provide a possible indication of, and early warning for, MS.
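For orientation, the sketch below shows how an MLSTM-FCN classifier over windowed eye-movement features might be assembled in Keras: an LSTM branch and a fully convolutional branch are run in parallel and concatenated before a softmax layer. The window length, layer sizes, and variable names (N_TIMESTEPS, N_FEATURES, N_CLASSES) are illustrative assumptions, not the authors' published configuration.

```python
# Minimal MLSTM-FCN sketch for windowed eye-movement features
# (BR, TNF, TDF, MDF, SA, SD, NN). Hyperparameters are assumed.
import tensorflow as tf
from tensorflow.keras import layers, models

N_TIMESTEPS = 30   # assumed number of time steps per window
N_FEATURES = 7     # BR, TNF, TDF, MDF, SA, SD, NN
N_CLASSES = 2      # 2 for binary MS detection, 3 for ternary severity levels

inputs = layers.Input(shape=(N_TIMESTEPS, N_FEATURES))

# LSTM branch: dimension shuffle (features treated as "time steps"),
# then an LSTM layer with dropout, following the MLSTM-FCN design.
x_lstm = layers.Permute((2, 1))(inputs)
x_lstm = layers.LSTM(64)(x_lstm)
x_lstm = layers.Dropout(0.5)(x_lstm)

# Fully convolutional branch: three Conv1D blocks with batch norm and
# ReLU, followed by global average pooling.
x_fcn = inputs
for filters, kernel in [(128, 8), (256, 5), (128, 3)]:
    x_fcn = layers.Conv1D(filters, kernel, padding="same")(x_fcn)
    x_fcn = layers.BatchNormalization()(x_fcn)
    x_fcn = layers.Activation("relu")(x_fcn)
x_fcn = layers.GlobalAveragePooling1D()(x_fcn)

# Concatenate both branches and classify MS level.
x = layers.Concatenate()([x_lstm, x_fcn])
outputs = layers.Dense(N_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In this layout, each input sample is a short multivariate time series of the seven eye-movement features, and switching N_CLASSES between 2 and 3 corresponds to the binary and ternary tasks reported in the abstract.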