Abstract

The ability to automatically assess emotional responses via contact-free video recording taps into a rapidly growing market aimed at predicting consumer choices. If consumer attention and engagement are measurable in a reliable and accessible manner, relevant marketing decisions could be informed by objective data. Although significant advances have been made in automatic affect recognition, several practical and theoretical issues remain largely unresolved. These concern the lack of cross-system validation, a historical emphasis on posed over spontaneous expressions, as well as more fundamental issues regarding the weak association between subjective experience and facial expressions. To address these limitations, the present paper argues that extant commercial and free facial expression classifiers should be rigorously validated in cross-system research. Furthermore, academics and practitioners must better leverage fine-grained emotional response dynamics, with stronger emphasis on understanding naturally occurring, spontaneous expressions in naturalistic choice settings. We posit that applied consumer research may be better situated to examine facial behavior in socio-emotional contexts rather than in decontextualized laboratory studies, and highlight how automatic human affect analysis (AHAA) can be successfully employed in this context. Finally, facial activity should be considered less as a single outcome variable and more as a starting point for further analyses. Implications of this approach and potential obstacles that need to be overcome are discussed within the context of consumer research.

Highlights

  • Emotions matter profoundly for understanding consumers’ behavior in the fast-changing economic markets of modern life (McStay, 2016)

  • The current paper aims to critically discuss the growing role of automatic human affect analysis (AHAA) in consumer research

  • We argue that automatic classification may provide substantial new leverage to the study of emotion and cognition in consumer neuroscience through both primary and subsequent machine analysis


Summary

INTRODUCTION

Emotions matter profoundly for understanding consumers’ behavior in the fast-changing economic markets of modern life (McStay, 2016). It remains largely unknown how rich socio-emotional knowledge about the context of dynamic expressions shapes their perception (Maringer et al., 2011). Such applied questions are of immediate relevance for consumer research, given that AHAA can provide per-frame classifications of large amounts of video data of human observers. While promising methods for analyzing facial behavior have been proposed, fewer efforts target the automatic analysis of spontaneous displays (Masip et al., 2014). This could be due to the rather limited number of available databases with naturalistic and spontaneous expressions used to train and test machine classifiers. AHAA of spontaneous expressions may contribute to increasingly better predictions of real-world consumer responses while minimizing the burden on ethical data collection in the field. Such an approach would provide a benchmark for comparisons between the different algorithms. These pre-processed facial activity data can themselves be used as input features for machine learning methods to learn and predict human emotional behavior in context.
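
To make the last point concrete, the sketch below illustrates one way per-frame facial activity output from an AHAA tool could serve as input features for a downstream machine-learning model. It is a minimal, hypothetical example only: the CSV layout, action-unit column names, file paths, and the choice-outcome label are illustrative assumptions, not a pipeline described in the paper.

# Minimal sketch: per-frame facial activity output from an AHAA tool
# (assumed here to be OpenFace-style CSVs with per-frame action-unit
# intensities) used as input features for a simple classifier.
# File names, column names, and labels are hypothetical.
import glob
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def video_features(csv_path):
    """Collapse per-frame action-unit intensities into one feature vector per video."""
    frames = pd.read_csv(csv_path)
    au_cols = [c for c in frames.columns if c.strip().startswith("AU")]
    aus = frames[au_cols].to_numpy()
    # Simple temporal summaries: mean, variability, and peak activation per AU.
    return np.concatenate([aus.mean(axis=0), aus.std(axis=0), aus.max(axis=0)])

# Hypothetical layout: one CSV of per-frame classifications per participant video,
# plus a table of behavioral outcome labels (e.g., whether a product was chosen).
csv_files = sorted(glob.glob("ahaa_output/*.csv"))
labels = pd.read_csv("choices.csv")["chose_product"].to_numpy()

X = np.vstack([video_features(f) for f in csv_files])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())

The temporal summaries (mean, variability, peak) are a deliberately simple stand-in for the fine-grained response dynamics discussed above; richer dynamic features could be substituted without changing the overall structure of such an analysis.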
