Abstract
Vehicle “faces” are a crucial factor influencing consumers’ intention to purchase gasoline and electric vehicles. However, little empirical evidence has shown whether people process a vehicle’s face in the same way as a human face. We investigated the relationship between the neural processing of human facial emotions and the facial emotions of gasoline and electric vehicles, using a 2 (emotion) × 3 (face type) repeated-measures design with electroencephalography (EEG) recordings. The results showed that human and vehicle faces appear to share a partly similar neural processing mechanism in the 100–300 ms latency range, and that both human and vehicle faces elicited the ERP components N170, EPN, and P2. The larger EPN and P2 elicited by gasoline vehicle faces suggest that their facial emotions are perceived more efficiently than those of electric vehicles. These findings offer insight to help vehicle designers better understand the facial emotions conveyed by cars.
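To make the ERP measures concrete, the following is a minimal sketch of how mean amplitudes for the N170, EPN, and P2 components could be extracted from epoched EEG data using MNE-Python. The file name, event codes, channel picks, and time windows are illustrative assumptions, not the authors’ actual analysis pipeline.

```python
# Illustrative sketch (assumptions throughout): extract mean ERP amplitudes
# for N170, EPN, and P2 from a 2 (emotion) x 3 (face type) design with MNE-Python.
import mne

# Hypothetical preprocessed recording with a stimulus channel.
raw = mne.io.read_raw_fif("preprocessed_eeg.fif", preload=True)
events = mne.find_events(raw)

# Assumed event codes for the six conditions.
event_id = {
    "human/positive": 1, "human/negative": 2,
    "gasoline/positive": 3, "gasoline/negative": 4,
    "electric/positive": 5, "electric/negative": 6,
}

# Epoch from -200 ms to 800 ms around stimulus onset, baseline-corrected.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(-0.2, 0.0), preload=True)

# Assumed component windows (s) and occipito-temporal channels of interest.
windows = {"N170": (0.13, 0.20), "P2": (0.15, 0.25), "EPN": (0.20, 0.30)}
picks = ["P7", "P8", "PO7", "PO8"]

for cond in event_id:
    evoked = epochs[cond].average(picks=picks)
    for comp, (t0, t1) in windows.items():
        # Mean amplitude over the window and channels, converted to microvolts.
        mean_amp_uv = evoked.copy().crop(tmin=t0, tmax=t1).data.mean() * 1e6
        print(f"{cond:20s} {comp}: {mean_amp_uv:.2f} µV")
```

Condition-wise mean amplitudes computed this way could then feed a repeated-measures ANOVA over the emotion and face-type factors.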