Abstract

A right-hand preference for visually guided grasping has been demonstrated in numerous studies. Grasping an object requires the integration of both the visual and motor components of visuomotor processing, and it has been suggested that the left hemisphere plays an integral role in visuomotor functions. The present study investigates whether the visual processing of graspable objects, without any actual reaching or grasping movements, yields a right-hand (left-hemisphere) advantage. Further, we aim to address whether such an advantage is automatically evoked by motor affordances. Two groups of right-handed participants were asked to categorize objects presented on a computer monitor by responding on a keypad. The first group categorized the visual stimuli as graspable (e.g. apple) or non-graspable (e.g. car). The second group categorized the same stimuli as nature-made (e.g. apple) or man-made (e.g. car). Reaction times to the visually presented stimuli were measured. Results showed a right-hand advantage for graspable objects only when participants responded to the graspable/non-graspable categorization; no such advantage emerged when participants categorized objects as nature-made or man-made. The results suggest that motor affordances may not always be automatic and might require conscious representations that are appropriate for object interaction.
