Abstract

In investigations of emotional expression, schematic faces are attractive stimuli because they can be precisely controlled. However, these stimuli are then often treated as comparable to real faces, which could be problematic. Previous research has shown that schematic faces are processed less holistically than naturalistic faces (Prazak & Burgund, 2014) and that the deficits autistic patients have in emotion detection may not extend to cartoon faces (Rosset et al., 2008). These examples suggest that schematic and naturalistic faces are, at least in some cases, processed differently. We hypothesized that higher levels of schematization increase the communicability of emotion. Here we tested participants in a discrimination task with emotional faces presented at varying degrees of schematization: in addition to unmanipulated greyscale photographs, we applied rotoscoping software to generate two heavily outlined cartoon versions of the same photographs, one high contrast and one low contrast. To vary featural complexity, simplified faces underwent the same treatment to create two schematic stimulus types. The resulting stimulus set contained five types of faces - three photo-based and two schematic - that non-linearly spanned the range from photographs to simple cartoons. On each trial, participants viewed a briefly presented face (17, 33, 50, or 66 ms) and were instructed to identify which emotion was present. At 17 ms, expressions in photographs were detected above chance. However, accuracy increased with every stimulus type moving away from photographs towards cartoons. That is, at shorter presentation times, discrimination became more successful as contrast increased and featural complexity decreased. These results suggest that, as faces are represented less realistically, their informational content is more easily accessed. Our data also suggest that both contrast and featural complexity influence how easily emotions can be detected in an image. Meeting abstract presented at VSS 2015.
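For readers unfamiliar with the "detected above chance" criterion, the sketch below illustrates one common way such a claim is evaluated: a one-sided binomial test of each condition's correct-response count against the chance rate. All numbers in it are illustrative placeholders; the trial counts, correct-response counts, and the assumed four-alternative response format (chance = 0.25) are not reported in the abstract and are assumptions made purely for the example.

```python
# Minimal sketch of an above-chance check for discrimination accuracy.
# All counts and the chance level (0.25, assuming a hypothetical
# four-alternative emotion choice) are placeholders, not the study's data.
from scipy.stats import binomtest

N_TRIALS = 80   # hypothetical trials per stimulus type at one presentation time
CHANCE = 0.25   # assumed chance rate for a four-alternative response

# hypothetical correct-response counts per stimulus type at 17 ms
correct_counts = {
    "photo": 28,
    "low-contrast cartoon": 35,
    "high-contrast cartoon": 41,
    "low-contrast schematic": 47,
    "high-contrast schematic": 52,
}

for stimulus, k in correct_counts.items():
    # one-sided test: is accuracy greater than chance?
    result = binomtest(k, n=N_TRIALS, p=CHANCE, alternative="greater")
    print(f"{stimulus:>24}: accuracy={k / N_TRIALS:.2f}, p={result.pvalue:.4f}")
```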
