Abstract

In a novel computer mouse tracking paradigm, participants read a spatial phrase such as “The blue item to the left of the red one” and then see a scene composed of 12 visual items. The task is to move the mouse cursor to the target item (here, blue), which requires perceptually grounding the spatial phrase. This entails visually identifying the reference item (here, red) and other relevant items through attentional selection. Response trajectories are attracted toward distractors that share the target color but match the spatial relation less well. Trajectories are also attracted toward items that share the reference color. A competing pair of items that match the specified colors but are in the inverse spatial relation increases attraction over-additively compared to individual items. Trajectories are also influenced by the spatial term itself. While the distractor effect resembles deviation toward potential targets in previous studies, the reference effect suggests that the relevance of the reference item for the relational task, not its role as a potential target, was critical. This account is supported by the strengthened effect of a competing pair. We conclude, therefore, that the attraction effects in the mouse trajectories reflect the neural processes that operate on sensorimotor representations to solve the relational task. The paradigm thus provides an experimental window through motor behavior into higher cognitive function and the evolution of activation in modal substrates, a longstanding topic in the area of embodied cognition.

Highlights

  • Most everyday tasks are a seamless combination of perception, cognition, and action

  • The target and the distractor are viewed as potential movement goals that must be disambiguated by grounding the spatial phrase

  • The effect was observed to comparable degrees when the target position relative to the reference item was specified by horizontal axis spatial terms (“left of” or “right of”) and when it was specified by vertical axis spatial terms (“above” or “below”)

Introduction

Institut für Neuroinformatik, Ruhr-Universität Bochum, Universitätsstraße 150, 44801 Bochum, Germany

Most everyday tasks are a seamless combination of perception, cognition, and action. To pick a snack at a self-service bakery, I have to recognize the different varieties of pastry on the counter, decide which one I like best, and reach for it. When I am rushed, I might start reaching before I know exactly which pastry I will pick, deciding as I go, and in effect my hand may follow a less-than-straight path as donuts are weighed against nearby croissants (Truong et al., 2013). In line with this intuition, psychological researchers increasingly agree that the neural processes underlying perception, cognition, and action are closely interlinked (e.g., Pezzulo & Cisek, 2016) and evolve in a graded and temporally continuous manner, rather than being strictly separable into sequential stages. An important source of support for this view comes from behavioral experiments in which motoric responses are influenced in a graded way by properties of perceptual or cognitive components of the task (Spivey, 2007).
