Abstract

In two separate experiments, the latencies associated with all four categories of response (correct detections, false alarms, correct rejections, and omissions) were recorded during the performance of a 45-min. visual monitoring task. The first experiment, concerned primarily with criterion changes in vigilance, manipulated signal probability; the second was concerned with sensitivity changes resulting from changes in event rate. In the first experiment, latencies associated with correct detections and false alarms increased, whereas those associated with correct rejections and omission errors decreased, with an increase in criterion. In the second experiment, a reduction in sensitivity associated with an increased event rate exerted significant and opposing effects on latencies of responses to signals (correct detections and omissions) while leaving the latencies of responses to nonsignals (correct rejections and false alarms) unchanged. In both experiments, the latencies associated with positive responses increased with time on task, whereas the latencies of negative responses decreased with time. These results are consistent with the predictions of a decision theory model for response latency extended from signal detection theory, which assumes an inverse relation between response latency and distance from the criterion. A decision theory analysis thus enables the interpretation of both detectability and latency measures of vigilance performance within the same theoretical framework.
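The core prediction can be illustrated with a small simulation. The sketch below assumes an equal-variance signal detection model with hypothetical parameters (the separation `D_PRIME`, trial count `N`, and latency scale `K` are illustrative choices, not values from the paper) and encodes the model's key assumption: latency is inversely related to the evidence sample's distance from the decision criterion. Medians are used rather than means because the reciprocal latency rule has a heavy tail for samples falling near the criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper)
D_PRIME = 1.0   # signal-noise separation in an equal-variance SDT model
N = 200_000     # simulated trials per stimulus type
K = 1.0         # arbitrary latency scale constant

def median_latencies(criterion):
    """Median latency per response category, assuming latency is inversely
    related to the evidence sample's distance from the criterion."""
    noise = rng.normal(0.0, 1.0, N)        # nonsignal (noise-only) trials
    signal = rng.normal(D_PRIME, 1.0, N)   # signal trials
    lat = lambda x: K / np.median(np.abs(x - criterion))
    return {
        "detection": lat(signal[signal > criterion]),   # correct detection
        "false_alarm": lat(noise[noise > criterion]),   # false alarm
        "rejection": lat(noise[noise < criterion]),     # correct rejection
        "omission": lat(signal[signal < criterion]),    # omission error
    }

lenient = median_latencies(0.2)   # lax criterion
strict = median_latencies(1.2)    # conservative criterion

# Raising the criterion lengthens positive-response latencies...
assert strict["detection"] > lenient["detection"]
assert strict["false_alarm"] > lenient["false_alarm"]
# ...and shortens negative-response latencies, as in Experiment 1.
assert strict["rejection"] < lenient["rejection"]
assert strict["omission"] < lenient["omission"]
```

Intuitively, a stricter criterion leaves "yes" responses crowded just above it (small distances, long latencies) while "no" responses sit farther below it (large distances, short latencies), reproducing the pattern reported for the first experiment.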
