Abstract
The experiment reported here used a variation of the spatial cueing task to examine the effects of unimodal and bimodal attention-orienting primes on target identification latencies and eye gaze movements. The primes were a nonspatial auditory tone and words known to drive attention in line with the dominant writing and reading direction, while also introducing a semantic, temporal bias (past-future) on the horizontal dimension. As expected, past-related (visual) word primes gave rise to shorter response latencies in the left hemifield and future-related words in the right. This congruency effect was qualified by asymmetrically better performance in the right hemifield following future words, driven by the left-to-right trajectory of scanning habits that facilitated search times and eye gaze movements to lateralized targets. The auditory tone prime alone acted as an alarm signal, boosting visual search and reducing response latencies. Bimodal priming, i.e., temporal visual words paired with the auditory tone, impaired performance by delaying visual attention and response times relative to the unimodal visual word condition. We conclude that bimodal primes were no more effective in capturing participants' spatial attention than the unimodal auditory and visual primes. The contribution of these findings to the literature on multisensory integration is discussed.
Highlights
This research was designed to examine how distinct stimuli, namely nonspatial auditory tones and visual temporal words, can modulate visual attention when presented concurrently.
The current study examines whether bimodal primes composed of a visual word and an auditory tone facilitate attention and target discrimination over their single unimodal presentation.
In a spatial cueing task, we examined the effects of unimodal and bimodal primes on lateralized target identification.
Summary
This research was designed to examine how distinct stimuli, namely nonspatial auditory tones and visual temporal words, can modulate visual attention when presented concurrently. Other research has shown that auditory tones temporally colocalized with visual targets drastically reduce visual search latencies (Dalton and Spence, 2007; Van der Burg et al., 2008; Vroomen and De Gelder, 2000). The current study examines whether bimodal primes composed of a visual word and an auditory tone facilitate attention and target discrimination over their single unimodal presentation. We used a bimodal cueing task with time words priming horizontal locations consistent with their semantic indication (past-left/future-right in ‘Western’ languages; Lakens et al., 2011) and biased by the culturally defined reading and writing direction (Suitner and Maass, 2016), together with a spatially neutral auditory prime previously shown to accelerate visual search towards the location of a synchronized visual event (Ngo and Spence, 2010; Vroomen and De Gelder, 2000).