Abstract

Separate groups of pigeons were trained to perform symbolic delayed matching to sample with auditory and visual sample stimuli. For animals in the auditory group, ambient tones that varied in frequency served as sample stimuli; for animals in the visual group, ambient red and green lights served as sample stimuli. In both cases, the sample stimuli were mapped onto the yellow and blue comparison stimuli presented on left and right pecking keys. In Experiments 1 and 2, it was found that visual and auditory delayed matching were affected in the same ways by several temporal variables: delay, length of exposure to the sample stimulus, and intertrial interval. In Experiments 3, 4A, and 4B, a houselight presented during the delay interval strongly interfered with retention in both visual and auditory groups, but white noise presented during the delay had little effect in either group. These results seem to be more in line with a prospective memory model, in which visual and auditory sample stimuli are coded into the same instructional memories, than with a model based on concepts of retrospective memory and modality specificity.
