Abstract

Rats discriminated auditory intensity differences of sinusoids at 3.0 kilohertz in a go/no-go signal detection procedure. Responses to the signal (hits) were reinforced with electrical brain stimulation, and misses produced a brief timeout. On intermixed noise trials, withholding a response (a correct rejection) was reinforced, and false alarms produced the timeout. In two test conditions, the signal was either the louder (100 decibels) or the softer (90, 93, 96, or 99 decibels) of the pair of intensities presented within a set of trials. Each animal was first trained with the signal as either the louder or the softer value, and the assignment was then reversed for the second condition so that the former noise value served as the signal. Hits showed shorter latencies than false alarms regardless of the relative intensity of signal and noise, and the magnitude of this differentiation was proportional to the signal-noise separation. Both hits and false alarms showed longer latencies as the discrimination became more difficult. Isosensitivity contours derived from the latencies were closely similar across conditions; by comparison, the yes-no measure of detectability, d', showed greater variability. The similarity of latency differentiation across the louder- and softer-signal conditions supports a detection model in which the observer's judgment is controlled by the distance of the sensory effect from the criterion on each trial, rather than by the loudness of the tones per se.
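
For context, the yes-no detectability measure d' mentioned above is conventionally computed as the difference between the z-transformed hit and false-alarm rates, d' = z(H) - z(F). The Python sketch below illustrates this standard signal-detection calculation; the rates used are hypothetical examples, not data from the study.

    # Standard yes-no d' from signal detection theory: d' = z(H) - z(F).
    # The hit and false-alarm rates below are hypothetical, for illustration only.
    from scipy.stats import norm

    def d_prime(hit_rate: float, fa_rate: float) -> float:
        """Difference of the inverse-normal (z) transforms of the two rates."""
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    print(d_prime(0.85, 0.20))  # ~1.88 for an 85% hit rate, 20% false-alarm rate

Applied per intensity pair, such a calculation yields one d' value per condition; the abstract reports that these values varied more across conditions than the latency-derived isosensitivity contours did.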
