Abstract

The mechanisms underlying multi-sensory interactions are still poorly understood despite considerable progress made since the first neurophysiological recordings of multi-sensory neurons. While the majority of single-cell neurophysiology has been performed in anesthetized or passive-awake laboratory animals, the vast majority of behavioral data stems from studies with human subjects. Interpretation of neurophysiological data implicitly assumes that laboratory animals exhibit perceptual phenomena comparable or identical to those observed in human subjects. To explicitly test this underlying assumption, here we characterized how two rhesus macaques and four humans detect changes in intensity of auditory, visual, and audio-visual stimuli. These intensity changes consisted of a gradual envelope modulation for the sound, and a luminance step for the LED. Subjects had to detect any perceived intensity change as fast as possible. By comparing the monkeys' results with those obtained from the human subjects, we found that (1) unimodal reaction times differed across modality, acoustic modulation frequency, and species; (2) the largest facilitation of reaction times with the audio-visual stimuli was observed when stimulus onset asynchronies were such that the unimodal reactions would occur at the same time (response, rather than physical, synchrony); and (3) the largest audio-visual reaction-time facilitation was observed when unimodal auditory stimuli were difficult to detect, i.e., at slow unimodal reaction times. We conclude that despite marked unimodal heterogeneity, similar multisensory rules applied to both species. Single-cell neurophysiology in the rhesus macaque may therefore yield valuable insights into the mechanisms governing audio-visual integration that may be informative of the processes taking place in the human brain.

Highlights

  • The integration of multi-sensory information benefits object detection, localization, and response latency (e.g., Todd, 1912; Hershenson, 1962; Gielen et al., 1983; Stein and Meredith, 1993; Frens et al., 1995; Corneil and Munoz, 1996; Stein, 1998; Corneil et al., 2002; Van Wanrooij et al., 2009)

  • Similar to what we found for audio-evoked reaction times (RTs), early release rates, RRearly (RTs < 0 ms), in the visual paradigm were ∼16% and ∼20% for monkeys M1 and M2, respectively

  • We argued that facilitation should be inversely correlated with amplitude-modulation detection difficulty, as obtained from unimodal RT modes


Introduction

The integration of multi-sensory information benefits object detection, localization, and response latency (e.g., Todd, 1912; Hershenson, 1962; Gielen et al., 1983; Stein and Meredith, 1993; Frens et al., 1995; Corneil and Munoz, 1996; Stein, 1998; Corneil et al., 2002; Van Wanrooij et al., 2009). Insights into the neuronal mechanisms underlying these principles stem predominantly from single-unit recordings in anesthetized animals (e.g., Meredith and Stein, 1983; Stein and Wallace, 1996; Bizley et al., 2007; Rowland et al., 2007), from passive awake animals (Schroeder et al., 2001; Bell et al., 2005; Ghazanfar et al., 2005; Schroeder and Foxe, 2005; Kayser et al., 2008), from lesion studies in cats (Stein et al., 1989; Burnett et al., 2004; Jiang et al., 2007; Rowland et al., 2014), and from a few combined behavioral and neurophysiological studies in cats (Peck, 1996; Jiang et al., 2002) and non-human primates (Frens and Van Opstal, 1998; Miller et al., 2001; Bell et al., 2005; Fetsch et al., 2011; Brosch et al., 2015; Plakke et al., 2015). In order to link animal neurophysiology and behavior with human psychophysics, it is necessary to directly compare the perceptual performance of human and other animal subjects performing identical tasks.
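The response-synchrony effect described in the abstract — that reaction-time facilitation peaks when the stimulus onset asynchrony (SOA) aligns the two unimodal responses in time — can be illustrated with a simple independent-race ("statistical facilitation") simulation. This is a hedged sketch with made-up reaction-time distributions, not the study's data or its analysis code; all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical unimodal reaction-time distributions (ms); the means and
# spreads are invented for illustration, not taken from the study.
rt_aud = rng.normal(350.0, 40.0, n)  # auditory RTs (slower here)
rt_vis = rng.normal(250.0, 30.0, n)  # visual RTs

def race_model_rt(rt_a, rt_v, soa):
    """Predicted bimodal RT under an independent race: on each trial the
    response is triggered by whichever modality finishes first, with the
    auditory stimulus shifted by `soa` ms relative to the visual one."""
    return np.minimum(rt_a + soa, rt_v)

# Facilitation relative to the faster unimodal mean, for physical
# synchrony (SOA = 0) versus the SOA that aligns the unimodal responses
# ("response synchrony"): soa ≈ mean(rt_vis) - mean(rt_aud).
best_soa = rt_vis.mean() - rt_aud.mean()  # negative: sound must lead
for soa in (0.0, best_soa):
    faster_unimodal = min(rt_aud.mean() + soa, rt_vis.mean())
    gain = faster_unimodal - race_model_rt(rt_aud, rt_vis, soa).mean()
    print(f"SOA {soa:7.1f} ms -> facilitation {gain:5.1f} ms")
```

With these toy distributions, facilitation is near zero at physical synchrony (one modality dominates every trial) and grows to roughly 20 ms once the SOA makes the two unimodal responses overlap, mirroring the paper's response-synchrony finding at the level of statistical facilitation alone.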

