Much of our understanding of how the brain processes dynamic faces comes from research comparing static photographs with dynamic morphs, which exhibit simplified, computer-generated motion. By comparing static, video-recorded, and dynamically morphed expressions, we aimed to identify the neural correlates of naturalistic facial dynamism using time-domain and time-frequency analyses. Dynamic morphs were created from the neutral and peak frames of video-recorded transitions to happy and fearful expressions; this retained the expression change but removed the asynchronous and non-linear features of naturalistic facial motion. We found that dynamic morphs elicited larger N400 amplitudes and smaller LPP amplitudes than the other stimulus types, whereas video recordings elicited larger LPP amplitudes and greater frontal delta activity than the other stimuli. Thematic analysis of participant interviews, conducted using a large language model, revealed that participants found it difficult to assess the genuineness of morphed expressions, and found it easier to judge the genuineness of happy than of fearful expressions. Our findings suggest that animating real faces with artificial motion may violate expectations (N400) and reduce the social salience (LPP) of dynamic morphs. The results also suggest that frontal delta oscillations may be involved in the perception of naturalistic facial motion in happy and fearful expressions. Overall, our findings highlight the sensitivity of the neural mechanisms underlying face perception to subtle changes in facial motion characteristics, which has important implications for neuroimaging research that uses faces with simplified motion.