Abstract

In interactive systems, knowing the user's emotional state is important not only to understand and improve the overall user experience, but also in scenarios where such information can help users manage and express their emotions (e.g., anxiety), with a strong impact on their daily life and on how they interact with others. Nevertheless, despite the clear potential of emotionally-aware applications, several challenges preclude their wider availability, stemming in part from the low translational nature of research in affective computing and from the lack of straightforward methods for integrating emotion into applications. In light of these challenges, we propose a conceptual vision for considering emotion in the scope of multimodal interactive systems and for how it can articulate with research in affective computing. Aligned with this vision, we present a first instantiation of an affective generic modality, and a proof-of-concept application, enabling multimodal interaction with Spotify, illustrates how the modality can provide emotional context in interactive scenarios.
