Abstract

Automated diagnostic aids can assist human operators in signal detection tasks by providing alarms, warnings, or diagnoses. In practice, however, operators often use such aids inefficiently, falling short of the best possible performance levels. Previous research has suggested that operators interact with binary signal detection aids using a sluggish contingent cutoff (CC) strategy (Robinson & Sorkin, 1985), shifting their response criterion in the direction stipulated by the aid's diagnosis on each trial but making adjustments that are smaller than optimal. The present study tested this model by examining the efficiency of automation-aided signal detection under different levels of task difficulty. In a pair of experiments, participants performed a numeric decision-making task that required them to make signal or noise judgments on the basis of probabilistic readings. The mean reading values of the signal and noise states differed between groups of participants, producing two levels of task difficulty. Data were fit with the CC model and two alternative accounts of automation-aided strategy: a discrete deference (DD) model, which assumed that participants defer to the aid outright on a subset of trials, and a mixture model, which assumed that participants choose randomly between the CC and DD strategies on each trial. Model fits favored the mixture model. The results indicate multiple forms of inefficiency in operators' strategies for using signal detection aids.
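
The three candidate strategies lend themselves to a brief simulation. The Python sketch below is illustrative only: the reading distributions, aid accuracy, criterion shift (DELTA), deference probability (P_DEFER), and mixing probability (P_CC) are hypothetical parameters chosen for demonstration, not the paper's task values or fitted estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters -- NOT the paper's task values or fitted estimates.
MU_NOISE, MU_SIGNAL, SIGMA = 0.0, 1.5, 1.0   # reading distributions
CRITERION = (MU_NOISE + MU_SIGNAL) / 2        # unaided decision cutoff
DELTA = 0.4    # assumed (smaller-than-optimal) criterion shift for CC
P_DEFER = 0.3  # assumed probability of deferring to the aid (DD)
P_CC = 0.5     # assumed per-trial probability of using CC in the mixture

def respond(reading, aid_says_signal, strategy):
    """Return True for a 'signal' response under the given strategy."""
    if strategy == "CC":
        # Contingent cutoff: shift the criterion toward the aid's diagnosis.
        c = CRITERION - DELTA if aid_says_signal else CRITERION + DELTA
        return reading > c
    if strategy == "DD":
        # Discrete deference: defer to the aid on a subset of trials,
        # otherwise respond from the reading alone.
        if rng.random() < P_DEFER:
            return aid_says_signal
        return reading > CRITERION
    # Mixture: choose randomly between CC and DD on each trial.
    return respond(reading, aid_says_signal,
                   "CC" if rng.random() < P_CC else "DD")

def simulate(strategy, n_trials=10_000, aid_accuracy=0.85):
    """Proportion correct over simulated trials for one strategy."""
    correct = 0
    for _ in range(n_trials):
        is_signal = rng.random() < 0.5
        reading = rng.normal(MU_SIGNAL if is_signal else MU_NOISE, SIGMA)
        aid_right = rng.random() < aid_accuracy
        aid_says_signal = is_signal if aid_right else not is_signal
        if respond(reading, aid_says_signal, strategy) == is_signal:
            correct += 1
    return correct / n_trials

for s in ("CC", "DD", "MIX"):
    print(f"{s}: proportion correct = {simulate(s):.3f}")
```

Under this toy parameterization, the mixture strategy simply interleaves the other two, so its simulated accuracy falls between theirs; fitting the models to real data would instead involve estimating the shift, deference, and mixing parameters per participant.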
