Abstract
Previous investigations of temporal integration in fishes have yielded conflicting results. Popper [J. Acoust. Soc. Am. 52, 596–602 (1972)] found no evidence of integration for signals between 10 and 500 ms in duration, while Offutt [J. Acoust. Soc. Am. 41, 13–19 (1967)] clearly did. Since one difference between these studies was the level of the background noise, we set out to measure the effect of noise on the detectability of tone pulses as a function of pulse duration. Three goldfish (Carassius auratus) were classically conditioned to detect tone pulses ranging from 2.5 to 100 ms in duration, at repetition periods of 100 and 1000 ms, under both ambient (unmasked) and broadband-noise conditions. Under noise, threshold as a function of signal duration showed nearly perfect integration (a 3-dB decrease in threshold per doubling of duration). Under ambient conditions, the function showed no consistent effect of signal duration, and thus no evidence of the type of integration expected from simple models [R. Plomp, J. Acoust. Soc. Am. 33, 1561–1569 (1961)]. We conclude that the presence of masking noise may account for the previously conflicting results, and that the neural processes used in signal detection by the goldfish are fundamentally different under noisy and quiet conditions. [Work supported by N.I.H. grants to R.R.F.]
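The "nearly perfect integration" criterion in the abstract is the prediction of a simple constant-energy model: if level × duration is constant at threshold, doubling the duration lowers the required level by 10·log10(2) ≈ 3 dB. A minimal sketch of that arithmetic (the function name and the example durations are illustrative, not taken from the study):

```python
import math

def threshold_shift_db(duration_ratio):
    """Predicted threshold change under perfect energy integration:
    signal energy (level x duration) is constant at threshold, so
    delta_level_dB = -10 * log10(duration_ratio)."""
    return -10.0 * math.log10(duration_ratio)

# Doubling the pulse duration lowers the predicted threshold by ~3 dB.
print(round(threshold_shift_db(2.0), 2))        # -3.01
# Over the 2.5-100 ms range used here (a factor of 40), the model
# predicts a ~16 dB drop in threshold.
print(round(threshold_shift_db(100 / 2.5), 1))  # -16.0
```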