This study tested a model of how animals discriminate the relative numerosity of stimuli in successive or sequential presentation tasks. In a discrete-trials procedure, pigeons were shown one light nf times and then another light nl times. They then received food for choosing the light that had occurred the fewer number of times during the sample. At issue were (a) how performance varies with the interval between the two stimulus sets (the interblock interval) and with the interval between the end of the sample and the beginning of the choice period (the retention interval), and (b) whether a simple mathematical model of the discrimination process could account for the data. The model assumed that the influence of a stimulus on choice increases linearly while the stimulus is presented but decays exponentially while the stimulus is absent; choice probability is given by the ratio of the influence values of the two stimuli. The model also assumed that, as the retention interval elapses, there is an increasing probability that the ongoing discriminative process is disrupted, in which case the animal responds randomly. Results showed that increasing the interblock interval reduced the probability of choosing the last stimulus of the sample as the least frequent one, whereas increasing the retention interval reduced accuracy without inducing any stimulus bias. The model accounted well for the major trends in the data.
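The model described in the abstract can be sketched computationally. The following is a minimal illustration, not the authors' implementation: the parameter names (growth, decay, disrupt) and the simplification of folding the second block's duration into the interblock interval are assumptions made here for clarity.

```python
import math

def p_choose_first_as_least_frequent(nf, nl, ibi, ri,
                                     growth=1.0, decay=0.1, disrupt=0.05):
    """Sketch of the abstract's model; parameter names are assumptions.

    nf, nl -- presentations of the first and last sample stimulus
    ibi    -- interblock interval (between the two stimulus blocks)
    ri     -- retention interval (between sample end and choice)
    """
    # Influence grows linearly with presentations and decays
    # exponentially while the stimulus is absent.  The first stimulus
    # is absent during the interblock and retention intervals (the
    # second block's duration is folded into ibi for simplicity).
    v_first = growth * nf * math.exp(-decay * (ibi + ri))
    v_last = growth * nl * math.exp(-decay * ri)

    # Ratio rule: the stimulus with the smaller influence is judged
    # least frequent, so each option is weighted by the other's value.
    p_ratio = v_last / (v_first + v_last)

    # Disruption: with a probability that grows over the retention
    # interval, the discrimination is lost and choice is random.
    p_disrupt = 1.0 - math.exp(-disrupt * ri)
    return (1.0 - p_disrupt) * p_ratio + p_disrupt * 0.5
```

The sketch reproduces the two qualitative results: lengthening the interblock interval decays the first stimulus's influence, biasing choice away from the last stimulus; lengthening the retention interval decays both influences equally (so the ratio is unchanged) while the disruption term pulls choice toward chance, reducing accuracy without bias.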