Abstract

Social event segmentation, the parsing of ongoing dynamic content into discrete social events, is thought to be a mechanism that supports the expert human ability to navigate complex social environments. Here, we examined whether this ability is influenced by the temporal coherence of the context and by different sources of perceptual information. To do so, we created two video clips: one in which several situations unfolded in a contextually consistent manner, and another in which the order of these situations was scrambled into a random sequence. Participants viewed each clip and were asked to mark social and nonsocial events in counterbalanced blocks of trials. We analyzed key-press behaviour as well as visual and auditory signals within the clips. Results showed that participants agreed on similar social and nonsocial events regardless of context availability, with greater agreement for social relative to nonsocial events. Context, however, modulated reliance on sources of perceptual information, such that visual and auditory information was used differently when context was unavailable. Together, these data show that contextual coherence does not determine social event segmentation but plays a modulatory role in perceivers' reliance on perceptual sources of information when identifying events in complex social environments.
