Abstract

A recent study (Guzman-Martinez et al., 2012) showed that participants match the frequency of an amplitude-modulated auditory stimulus to visual spatial frequency with a linear relationship, and suggested that this crossmodal mapping automatically guides attention to specific spatial frequencies. We replicated the reported matching relationship and also performed matching between tactile and visual spatial frequency. We then used a visual search paradigm to investigate whether auditory or tactile cues can guide attention to matched visual spatial frequencies. Participants were presented with a search display containing multiple Gabors, each with a different spatial frequency. When the auditory or tactile cue was informative, search efficiency improved for some matched spatial frequencies, with the specificity of the effect being greater for touch than for audition. However, when the cues were uninformative, neither auditory nor tactile cues had any effect on visual search performance. Furthermore, informative but unmatched auditory cues (shifted substantially from the reported match, but preserving relative position) also improved search performance. Taken together, these findings suggest that although auditory and tactile cues can influence visual selection of a matched spatial frequency, the effects are due to top-down attentional control rather than automatic attentional capture derived from a low-level mapping.
