Abstract

A central component of search and rescue missions is the visual search for survivors.
In large part, this depends on human operators and is therefore subject to the
constraints of human cognition, such as mental fatigue. This makes detecting mental
fatigue a critical step to be implemented in future systems. However, to the best of
our knowledge, it has seldom been evaluated using a realistic visual search task. In
addition, an accuracy discrepancy exists between studies that label fatigue by time-on-task (TOT),
the most popular method, and those that label it by behavioural performance. Yet, to our knowledge,
the two labelling approaches have never been directly compared. This study was designed to address both issues:
the use of a realistic, monotonous visual search task to elicit mental fatigue, and the
type of labelling used for intra-participant fatigue estimation. Over four
blocks of 15 minutes, participants had to identify targets on a computer while their
cardiac, cerebral (EEG), and eye-movement activities were recorded. The recorded
data were then fed into several physiological computing pipelines. The results show
that the capability of a machine learning algorithm to detect mental fatigue depends
less on the input data than on how mental fatigue is defined. Using TOT, very
high classification accuracies are obtained (e.g. 99.3%). On the other hand, if mental
fatigue is estimated based on behavioural performance, a metric with a much greater
operational value, classification accuracies drop to chance level (i.e. 52.2%). TOT-based
mental fatigue estimation is easy to apply, and strong classification accuracies can be
achieved with a multitude of sensors. These factors contribute to the popularity of
the method, but both its usability and its relation to the concept of mental fatigue are
neglected.
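
As a purely illustrative sketch (not the authors' pipeline), the contrast between the two labelling schemes can be mimicked on synthetic data: features that drift with time-on-task are trivially separable under TOT labels, while labels derived from an unrelated performance measure stay near chance. All variable names and numbers below are hypothetical, and scikit-learn's RandomForestClassifier merely stands in for the paper's physiological computing pipelines.

# Hypothetical illustration: TOT labels vs. performance-based labels
# for intra-participant fatigue classification on synthetic features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-epoch physiological features
# (e.g. EEG band power, heart-rate variability, fixation duration).
n_epochs, n_features = 240, 12
X = rng.normal(size=(n_epochs, n_features))
block = np.repeat(np.arange(4), n_epochs // 4)   # four 15-minute blocks
X += block[:, None] * 0.5                        # slow drift with time-on-task

# TOT labels: first two blocks = "fresh", last two = "fatigued".
y_tot = (block >= 2).astype(int)

# Performance labels: median split of a (synthetic) detection score that,
# in this toy example, is unrelated to the features.
perf = rng.normal(size=n_epochs)
y_perf = (perf < np.median(perf)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
for name, y in [("TOT labels", y_tot), ("performance labels", y_perf)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")

Because the synthetic features drift with time, the TOT-labelled classifier scores far above chance, whereas the performance-labelled one does not, mirroring the accuracy gap reported in the abstract.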
