Abstract

Processing of multimodal information is essential for an organism to respond to environmental events. However, how multimodal integration in neurons translates into behavior is far from clear. Here, we investigate integration of biologically relevant visual and auditory information in the goldfish startle escape system, in which paired Mauthner-cells (M-cells) initiate the behavior. Sound pips and visual looms, as well as multimodal combinations of these stimuli, were tested for their effectiveness in evoking the startle response. Results showed that adding a low-intensity sound early during a visual loom (low visual effectiveness) produced a supralinear increase in startle responsiveness compared with the increase expected from a linear summation of the two unimodal stimuli. In contrast, adding a sound pip late during the loom (high visual effectiveness) increased responsiveness consistent with a linear multimodal integration of the two stimuli. Together, the results confirm the Inverse Effectiveness Principle (IEP) of multimodal integration proposed in other species. Given the well-established role of the M-cell as a multimodal integrator, these results suggest that IEP is computed in individual neurons that initiate vital behavioral decisions.
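To make the comparison between linear and supralinear integration concrete, the following is a minimal sketch in our own notation (the paper's exact analysis may differ): let $p_A$ and $p_V$ denote the startle probabilities evoked by the sound pip alone and the visual loom alone, and $p_{AV}$ the probability evoked by the combined stimulus.

\[ \hat{p}_{AV} = \min(p_A + p_V,\ 1) \qquad \text{(linear-summation prediction)} \]
\[ p_{AV} > \hat{p}_{AV} \ \Rightarrow\ \text{supralinear enhancement}, \qquad p_{AV} \approx \hat{p}_{AV} \ \Rightarrow\ \text{linear integration} \]

Under this reading, a sound pip added early in the loom (low $p_V$) pushes the combined response into the supralinear regime, whereas a pip added late (high $p_V$) yields a response close to the additive prediction.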

Highlights

  • Integration of sensory information from different modalities is essential for making appropriately timed behavioral decisions

  • The Inverse Effectiveness Principle (IEP) predicts an inverse relationship between the individual effectiveness of two unimodal stimuli presented alone and the effectiveness of their combination: multimodal integration of two weak stimuli produces a response that is disproportionately larger than the response evoked by integration of two strong stimuli (Meredith and Stein, 1986; Stein et al., 2014); see the worked example after these highlights

  • Auditory-evoked startles occur within a narrow range of latencies; to illustrate the range of response latencies evoked by auditory and visual stimuli, all responses for a given modality were combined (Figure 1C)
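As a purely illustrative calculation (hypothetical numbers, not data from the paper), the ceiling on responsiveness makes the inverse relationship intuitive:

\[
\begin{aligned}
&\text{weak + weak:} && p_A = p_V = 0.1,\quad \hat{p}_{AV} = 0.2,\quad \text{an observed } p_{AV} = 0.5 \text{ would be } 150\%\ \text{above the additive prediction};\\
&\text{strong + strong:} && p_A = p_V = 0.6,\quad \hat{p}_{AV} = \min(1.2,\ 1) = 1,\quad \text{so at most a modest, near-additive gain is possible.}
\end{aligned}
\]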



Introduction

Integration of sensory information from different modalities is essential for making appropriately timed behavioral decisions. Neurons processing multimodal inputs are found throughout the CNS, most prominently in the cortical sensory processing areas and the superior colliculus in mammals (Meredith et al., 1987; Wallace et al., 1998; Ghazanfar and Schroeder, 2006; King and Walker, 2012), and in the optic tectum and hindbrain in birds, amphibians, and fish (Winkowski and Knudsen, 2006; Hiramoto and Cline, 2009; Mu et al., 2012; Medan et al., 2018). Multimodal integration depends on overlapping timing and/or spatial location of unimodal stimuli and typically results in an enhancement of the neural and behavioral response. Our goal was to study the IEP phenomenon in a downstream circuit where a distinct behavior can be directly related to sensorimotor neural processing.

