Abstract

Objects are reminiscent of the actions often performed with them: a knife and an apple remind us of peeling or cutting the apple. Mnemonic representations of object-related actions (action codes), evoked by the sight of an object, may constrain and hence facilitate the recognition of unfolding actions. The present fMRI study investigated whether and how action codes influence brain activation during action observation. The average number of action codes (NAC) associated with each of 51 sets of objects was rated by a group of n = 24 participants. In the fMRI experiment, a different group of volunteers was asked to recognize actions performed with the same objects, presented in short videos. To disentangle areas reflecting the storage of action codes from those exploiting them, we showed object-compatible and object-incompatible (pantomime) actions. Areas storing action codes were expected to co-vary positively with NAC for both object-compatible and object-incompatible actions; because of its role in tool-related tasks, we hypothesized the left anterior inferior parietal cortex (aIPL) to show this effect. In contrast, areas exploiting action codes were expected to show this correlation only for object-compatible but not for object-incompatible actions, as only object-compatible actions match one of the evoked action codes. For this interaction, we hypothesized that ventrolateral premotor cortex (PMv) would join aIPL, given its role in biasing competition within IPL. We found activity in the left anterior intraparietal sulcus (IPS) and the left posterior middle temporal gyrus (pMTG) to co-vary with NAC. In addition to these areas, action codes increased activity during object-compatible actions in bilateral PMv, right IPS, and lateral occipital cortex (LO). The findings suggest that during action observation, the brain derives possible actions from perceived objects and uses this information to shape action recognition. In particular, the number of expectable actions scales the activity level in PMv, IPL, and pMTG, but only PMv reflects their biased competition while the observed action unfolds.
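The co-variation logic of this design can be illustrated with a minimal sketch (not the authors' analysis pipeline): assuming hypothetical NAC ratings and region-of-interest activation estimates per object set, an area storing action codes should correlate with NAC in both conditions, whereas an area exploiting action codes should show the correlation only for object-compatible actions, i.e., an NAC-by-compatibility interaction.

```python
# Minimal sketch with hypothetical data; variable names and values are illustrative only.
# - nac: mean number of action codes (NAC) rated for each object set
# - beta_compatible / beta_incompatible: hypothetical ROI activation estimates
#   for object-compatible vs. pantomime (object-incompatible) videos
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sets = 51                                    # 51 object sets, as in the study
nac = rng.uniform(1, 6, size=n_sets)           # hypothetical mean NAC ratings

# Hypothetical ROI betas: the compatible condition tracks NAC, the incompatible one does not.
beta_compatible = 0.8 * nac + rng.normal(0, 1, n_sets)
beta_incompatible = rng.normal(0, 1, n_sets)

# "Storage" signature: NAC correlates with activation in both conditions.
r_comp, p_comp = stats.pearsonr(nac, beta_compatible)
r_incomp, p_incomp = stats.pearsonr(nac, beta_incompatible)

# "Exploitation" signature (interaction): NAC predicts the
# compatible-minus-incompatible difference in activation.
r_int, p_int = stats.pearsonr(nac, beta_compatible - beta_incompatible)

print(f"compatible:   r = {r_comp:.2f}, p = {p_comp:.3f}")
print(f"incompatible: r = {r_incomp:.2f}, p = {p_incomp:.3f}")
print(f"interaction:  r = {r_int:.2f}, p = {p_int:.3f}")
```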

Highlights

  • Observed action constitutes a highly complex stimulus that prompts a multitude of attentional and memory processes

  • Regarding our hypothesis on areas housing action codes (H1), we focused on the anterior inferior parietal cortex (aIPL) because of converging findings from various studies reporting left aIPL to be engaged in representing the pragmatic properties of objects, i.e., manipulation knowledge (e.g., Chao and Martin, 2000; Kellenbach et al., 2003; Johnson-Frey, 2004; Rumiati et al., 2004; Boronat et al., 2005; Ishibashi et al., 2011)

  • We used fMRI in an action observation paradigm to test whether left aIPL houses action codes, i.e., whether its activation level varies as a function of the number of currently evoked action codes

Introduction

Observed action constitutes a highly complex stimulus that prompts a multitude of attentional and memory processes. The observer has to remain flexible with regard to potential actions that may unfold, yet quickly discard those that do not pertain to the current situation. When considering object-related action, the observer has access to at least two sources of information that usually help to quickly recognize the most probable action goal: manipulation movements and objects. These two basic sources of information, rather than being merely complementary, are intimately interrelated: familiar objects such as mobile phones or knives are strongly reminiscent of the manipulations that we perform with them every day. When seeing someone handling a knife and an apple, the object set “knife, apple” evokes two action codes: “cutting apple with knife” and “peeling apple with knife.” While tracking the unfolding manipulation, at some point we notice that the peeling action code matches the observed manipulation, and recognize that the actor is peeling the apple with the knife (object function), probably to prepare it for eating (goal).
