Abstract
It has long been understood that the ventral visual stream of the human brain processes features of simulated human faces. Recently, specificity for real and interactive faces has been reported in the lateral and dorsal visual streams, raising new questions about the neural coding of interactive faces and the underlying lateral and dorsal face-processing mechanisms. We compare neural activity during two live face-to-face conditions in which facial features and tasks remain constant while the social context (in-person versus on-line) is varied. Current models of face processing do not predict differences between these two conditions, as the features do not vary. However, behavioral eye-tracking measures showed longer visual dwell times on the real face, and pupil diameters indicated increased arousal in the real-face condition. Consistent with these behavioral findings, functional near-infrared spectroscopy (fNIRS) showed signal increases in dorsal-parietal regions for real faces, and increased cross-brain synchrony was also found within these dorsal-parietal regions in the real, in-person face condition. Simultaneously acquired electroencephalography (EEG) also showed increased theta power in the real conditions. These neural and behavioral differences highlight the importance of natural, in-person paradigms and social context for understanding live and interactive face processing in humans.