Abstract

Emotion-aware services are increasingly used in applications such as gaming, mental health tracking, video conferencing, and online tutoring. At the core of such services is usually a machine learning model that automatically infers the user's emotions from biological indicators (e.g., physiological signals and facial expressions). However, such models often require a large number of emotion annotations, or ground truth labels, which are typically collected as manual self-reports through long-term user studies based on the Experience Sampling Method (ESM). Responding to repetitive ESM probes for self-reports is time-consuming and fatigue-inducing. This burden leads users to respond arbitrarily or drop out of studies, compromising model performance. To counter this issue, we propose a Human-AI Collaborative Emotion self-report collection framework, HACE, that significantly reduces the self-report collection effort. HACE encompasses an active learner that is bootstrapped with a few emotion self-reports (as seed samples) and then queries users only for instances on which it is not confident, using these labels to retrain itself and predict emotion self-reports more efficiently. We evaluated the framework in a smartphone keyboard-based emotion self-report collection scenario by performing a 3-week in-the-wild study (N = 32). The evaluation of HACE on this dataset (≈11,000 typing sessions corresponding to more than 200 hours of typing data) demonstrates that it requires 46% fewer self-reports than the baselines to train the emotion self-report detection model, yet outperforms the baselines with an average self-report detection F-score of 85%. These findings demonstrate the potential of such a human-AI collaborative approach to reduce emotion self-report collection effort.
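The query strategy described above can be illustrated with a minimal sketch of uncertainty-based active learning. All names, the margin-based confidence measure, and the toy nearest-centroid classifier below are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of a HACE-style active-learning loop: start from a few
# labeled seed self-reports, and probe the user (the "oracle") only when the
# model's confidence margin on a new instance is low.

def train_centroids(labeled):
    """Fit a toy nearest-centroid classifier on (feature_vector, label) pairs."""
    sums, counts = {}, {}
    for x, y in labeled:
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict_with_confidence(centroids, x):
    """Predict a label; confidence = margin between the two nearest centroids."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(c, x)) ** 0.5, y)
        for y, c in centroids.items()
    )
    label = dists[0][1]
    margin = dists[1][0] - dists[0][0] if len(dists) > 1 else float("inf")
    return label, margin

def active_learning_loop(seed, pool, oracle, margin_threshold=0.5):
    """seed: initial labeled self-reports; pool: unlabeled typing sessions;
    oracle: callable standing in for the user's ESM self-report response."""
    labeled = list(seed)
    queries = 0
    for x in pool:
        model = train_centroids(labeled)
        label, margin = predict_with_confidence(model, x)
        if margin < margin_threshold:   # not-so-confident: probe the user
            labeled.append((x, oracle(x)))
            queries += 1
        else:                           # confident: accept the model's label
            labeled.append((x, label))
    return train_centroids(labeled), queries
```

In this sketch, only ambiguous instances trigger an ESM probe, so the number of queries grows with model uncertainty rather than with the size of the data stream, which is the mechanism by which self-report effort is reduced.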

