Abstract
Object recognition is a complex cognitive process that relies on how the brain organizes object-related information. While spatial organizational principles have been extensively studied, the less-studied temporal dynamics may also offer valuable insights into this process, particularly when neural processing overlaps for different categories, as is the case for hands and tools. Here we focus on the differences and/or similarities between the time courses of hand and tool processing measured with electroencephalography (EEG). Using multivariate pattern analysis, we compared, at each time point, the accuracy with which images of hands or images of tools could be classified against images of animals. We show that classification accuracy for hands and for tools differs within particular time intervals (~ 136–156 ms and ~ 252–328 ms). Furthermore, we show that classifiers trained to differentiate between tools and animals generalize their learning to the classification of hand stimuli between ~ 260–320 ms and ~ 376–500 ms after stimulus onset. Classifiers trained to distinguish between hands and animals, in turn, generalized to the classification of tools at ~ 150 ms. These findings suggest variations in semantic features and domain-specific differences between the two categories, with later-stage similarities potentially related to shared action processing for hands and tools.
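The time-resolved decoding and cross-category generalization described above can be illustrated with a minimal sketch (not the authors' pipeline): for each time point, a classifier is trained on the EEG channel pattern across trials, and for cross-decoding it is trained on one category contrast (e.g., tools vs. animals) and tested on another (e.g., hands vs. animals). The array names, shapes, and the linear SVM choice below are assumptions for illustration only.

```python
# Minimal sketch of time-resolved MVPA decoding and cross-decoding on epoched EEG.
# Assumes hypothetical arrays X of shape (n_trials, n_channels, n_times) and
# binary labels y of shape (n_trials,); classifier choice is an assumption.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def decode_over_time(X, y, cv=5):
    """Cross-validated classification accuracy at each time point."""
    n_times = X.shape[2]
    scores = np.empty(n_times)
    for t in range(n_times):
        clf = make_pipeline(StandardScaler(), LinearSVC())
        scores[t] = cross_val_score(clf, X[:, :, t], y, cv=cv).mean()
    return scores

def cross_decode_over_time(X_train, y_train, X_test, y_test):
    """Train on one category contrast, test on another, per time point."""
    n_times = X_train.shape[2]
    scores = np.empty(n_times)
    for t in range(n_times):
        clf = make_pipeline(StandardScaler(), LinearSVC())
        clf.fit(X_train[:, :, t], y_train)
        scores[t] = clf.score(X_test[:, :, t], y_test)
    return scores

# Hypothetical usage:
# acc_tools = decode_over_time(X_tools_vs_animals, y_tools_vs_animals)
# acc_transfer = cross_decode_over_time(X_tools_vs_animals, y_tools_vs_animals,
#                                       X_hands_vs_animals, y_hands_vs_animals)
```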