Abstract

Most neuroimaging experiments that investigate how tools and their actions are represented in the brain use visual paradigms in which tools or hands are displayed as 2D images and no real movements are performed. These studies discovered selective visual responses in occipitotemporal and parietal cortices for viewing pictures of hands or tools, which are assumed to reflect action processing, but this assumption has rarely been directly tested. Here, we examined the responses of independently visually defined category-selective brain areas when participants grasped 3D tools (N = 20; 9 females). Using real-action fMRI and multivoxel pattern analysis, we found that grasp typicality representations (i.e., whether a tool is grasped appropriately for use) were decodable from hand-selective areas in occipitotemporal and parietal cortices, but not from tool-, object-, or body-selective areas, even when these areas partially overlapped with hand-selective cortex. Importantly, these effects were specific to actions with tools and did not emerge for biomechanically matched actions with control nontools. In addition, grasp typicality decoding was significantly higher in hand-selective than in tool-selective parietal regions. Notably, grasp typicality representations were automatically evoked even when there was no requirement for tool use and participants were naive to object category (tools vs. nontools). Finding a specificity for typical tool grasping in hand-selective, rather than tool-selective, regions challenges the long-standing assumption that activation for viewing tool images reflects sensorimotor processing linked to tool manipulation. Instead, our results show that typicality representations for tool grasping are automatically evoked in visual regions specialized for representing the human hand, the brain's primary tool for interacting with the world.

Highlights

  • The emergence of handheld tools marks the beginning of a major discontinuity between humans and our closest primate relatives (Ambrose, 2001)

  • We found the first evidence that hand-selective cortex represents whether a 3D tool is being grasped appropriately by its handle

  • The same effects were not observed in tool, object, or body-selective areas, even when these areas overlapped with hand-selective voxels in IPS and LOTC


Introduction

The emergence of handheld tools (e.g., a spoon) marks the beginning of a major discontinuity between humans and our closest primate relatives (Ambrose, 2001). A highly replicable functional imaging finding is that viewing tool pictures activates sensorimotor brain areas (Lewis, 2006), but what drives this functional selectivity? We would never grasp a picture of a tool, and, more importantly, finding spatially overlapping activation between two tasks does not directly imply that the same neural representations are being triggered (Dinstein et al., 2008; Martin, 2016). Intraparietal activation for viewing tool pictures versus grasping shows poor correspondence (Valyear et al., 2007; Gallivan et al., 2013), questioning the long-standing assumption that visual tool selectivity reflects sensorimotor aspects of manipulation.

