Abstract

Several computer systems have been designed for music emotion research that aim to identify how different structural or expressive cues of music influence the emotions conveyed by the music. However, most systems either operate offline by pre-rendering different variations of the music, or operate in real-time but focus mostly on structural cues. We present a new interactive system called EmoteControl, which allows users to make changes to both structural and expressive cues (tempo, pitch, dynamics, articulation, brightness, and mode) of music in real-time. The purpose is to allow scholars to probe a variety of cues of emotional expression with non-expert participants who are unable to articulate or perform their intended expression in other ways. An interactive system is particularly valuable in this field because the parameter space of emotion cues and cue levels is vast and challenging to explore exhaustively without a dynamic system. A brief overview of previous work is given, followed by a detailed explanation of EmoteControl’s interface design and structure. A portable version of the system is also described, and specifications for the music input to the system are outlined. Several use-cases of the interface are discussed, and a formal interface evaluation study is reported. Results suggested that the elements controlling the cues were easy to use and understood by users. The majority of users were satisfied with the way the system allowed them to express different emotions in music and found it a useful tool for research.

Highlights

  • This paper focuses on music and emotion research, in particular on musical cues and their effect on emotional expression

  • An overview of previous research on musical cues and emotions is offered, ranging from traditional experimental methodologies to the computer systems designed for this research field, and highlighting some shortcomings in previous research that we aim to redress with our system

  • All participants agreed that the interface is extremely easy to get accustomed to, and the terms utilised as labels for the different features were clear, with nine participants selecting the highest ‘extremely clear’, and the remaining three participants, who were all non-musicians, selecting the second highest ‘somewhat clear’

Introduction

This paper focuses on music and emotion research, in particular on musical cues and their effect on emotional expression. Traditional studies pre-render a fixed set of stimuli, which constrains how many cue combinations can be explored; in an interactive paradigm where the user is allowed to control the cue levels in real-time, this constraint is largely absent. Such an interface is challenging to design and implement in a way that satisfies users. We propose a new real-time interactive system called EmoteControl, which taps into a direct user experience and allows users to control emotional expression in music via a selection of musical cues. The main contribution of EmoteControl is that it is a tool that can be utilised by non-experts, allowing users without any musical training to change cues of the music to convey different emotions in real-time.
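To make the idea of real-time cue control concrete, the sketch below models the six cues named in the abstract (tempo, pitch, dynamics, articulation, brightness, and mode) as a single parameter state applied to notes at playback time. This is an illustrative assumption of how such a mapping could work, not the published EmoteControl implementation; all names, ranges, and the simple major-to-minor rule are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CueState:
    """Hypothetical cue sliders, one per cue in the EmoteControl paper."""
    tempo_bpm: float = 100.0   # structural: playback speed
    pitch_shift: int = 0       # structural: transposition in semitones
    dynamics: int = 80         # expressive: loudness as MIDI velocity (0-127)
    articulation: float = 1.0  # expressive: note-length ratio (<1.0 = staccato)
    brightness: int = 64       # timbral: e.g. a filter-cutoff controller (0-127)
    mode: str = "major"        # structural: "major" or "minor"

def apply_cues(note, state):
    """Render one (midi_pitch, duration_in_beats) note under the cue state."""
    pitch, beats = note
    pitch += state.pitch_shift
    # Toy mode change: flatten the 3rd, 6th and 7th degrees of C major.
    if state.mode == "minor" and pitch % 12 in (4, 9, 11):
        pitch -= 1
    seconds = beats * 60.0 / state.tempo_bpm * state.articulation
    return pitch, round(seconds, 3), state.dynamics, state.brightness

# A "sad" setting: slow, low, quiet, dark, minor.
sad = CueState(tempo_bpm=60, pitch_shift=-2, dynamics=50,
               articulation=1.0, brightness=30, mode="minor")
print(apply_cues((64, 1.0), sad))  # → (62, 1.0, 50, 30)
```

In a real-time system the `CueState` fields would be bound to interface sliders and re-read on every note event, so a participant's adjustments take effect immediately without re-rendering the piece.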
