Forty-eight subjects detected a long-duration (1.7 or 1.3 sec.) change in brightness (from a 5 ft.-l. standard to a 4 ft.-l. level) of an electroluminescent panel during a 60-min. monitoring session. Signal/nonsignal ratios (1/9 or 1/1) and payoffs (lax, neutral, or strict) were combined factorially in a between-subjects design. Signal ratios affected both the percent of signal detections and the percent of false-alarm errors. When subjects monitored under the lower signal ratio, the percent of signal detections decreased over time. Payoffs affected only the percent of false alarms in the higher signal-ratio condition. Signal detection theory analyses showed a slight decrease in d' and a marked increase in β during the watch period. The change in β was due primarily to the lower signal-ratio conditions; payoffs had no effect on the subsequent change in β. It was concluded that signal ratios, rather than payoffs, play the major role in determining decision performance in simple visual monitoring tasks.
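For readers unfamiliar with the indices reported above, the sensitivity measure d' and the decision criterion β are conventionally computed from hit and false-alarm rates under the Gaussian equal-variance model of signal detection theory. The sketch below shows that standard computation; the abstract does not give the authors' exact procedure, and the example rates are illustrative, not data from the study.

```python
import math
from statistics import NormalDist

def sdt_indices(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Return (d', beta) from hit and false-alarm rates
    under the equal-variance Gaussian model."""
    z = NormalDist().inv_cdf            # inverse standard-normal CDF
    z_h, z_f = z(hit_rate), z(fa_rate)
    d_prime = z_h - z_f                 # sensitivity: distance between noise
                                        # and signal-plus-noise distributions
    beta = math.exp((z_f**2 - z_h**2) / 2)  # likelihood ratio at the criterion
    return d_prime, beta

# Illustrative rates: 80% hits, 20% false alarms
d_prime, beta = sdt_indices(0.80, 0.20)
# d' ≈ 1.68; beta = 1.0 (symmetric rates imply an unbiased criterion)
```

Holding the hit rate fixed while the false-alarm rate falls drives β above 1, i.e. a stricter criterion; the marked rise in β over the watch period reported above corresponds to observers becoming increasingly conservative about reporting signals.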