The detectability of tones, or of intensity increments to tones, in bands of random noise was measured for conditions in which the overall level was fixed or was randomly roved from interval to interval of every experimental trial. The purpose of the within-trial rove was to limit the usefulness of a detection strategy based on overall level or level within a single "critical band." At "supracritical" bandwidths, the functions relating masked threshold to noise bandwidth for the roved conditions were similar to those obtained when no rove was employed. At "subcritical" bandwidths, thresholds were higher in some roved conditions but, for the largest rove, were still lower than would be predicted from arguments based purely on level detection, with one exception. A comparison of observer performance relative to the statistical limits imposed by the roving-level procedure indicated that the traditional critical-band energy-detector model could not account for the results, which are attributed to discrimination based on spectral shape or on waveshape.
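The statistical limit that a within-trial level rove places on a pure level-detection strategy can be illustrated with a short Monte Carlo sketch (an illustration only, not the authors' procedure; the function name, the two-interval forced-choice framing, and the uniform rove distribution are assumptions):

```python
import random

def level_detector_pc(delta_db, rove_db, trials=100_000, seed=0):
    """Estimate proportion correct for a hypothetical observer who
    simply picks the higher-level interval in a two-interval trial,
    when the overall level is roved independently in each interval.

    delta_db : level increment (dB) added by the signal
    rove_db  : range (dB) of the uniform level rove
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        signal_level = rng.uniform(0, rove_db) + delta_db  # signal interval
        standard_level = rng.uniform(0, rove_db)           # standard interval
        if signal_level > standard_level:
            correct += 1
    return correct / trials
```

With a small increment and a large rove (e.g., 1 dB against a 20-dB rove), this level-only observer performs near chance, which is why observers who remain well above that limit must be using a cue other than overall level, such as spectral shape or waveshape.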