Abstract
The representation of spatial information related to an event can influence behavior even when location is task-irrelevant, as in the case of Stimulus-Response (S-R) compatibility effects in the Simon task. However, unlike the single-modality situations often used to study the Simon effect, in real-life scenarios various sensory modalities provide spatial information coded in different coordinate systems. Here, we address the expression of S-R compatibility effects in mixed-modality contexts, where events can occur in one of several sensory modalities (i.e., vision, touch, or audition). The results confirm that, in single-modality cases, Simon effects in vision are expressed in an external spatial frame of reference, whereas touch information is coded anatomically. Remarkably, when visual and tactile trials were mixed unpredictably, the Simon effect disappeared in vision, whereas tactile Simon effects remained expressed in their own (anatomical) frame of reference. Mixing visual and auditory stimuli did not obliterate the visual Simon effect, and S-R compatibility effects in an external reference frame were evident for both modalities. The extinction of visual Simon effects as a result of mixing visual and tactile modalities can be interpreted as a consequence of the dynamic reorganization of the weights associated with the different sources of spatial information at play.