Abstract

Playing music is about conveying emotion, and the lighting at a concert can help do that. However, new and lesser-known bands that play at smaller venues, and bands without the budget to hire a dedicated lighting technician, miss out on lighting that would help them convey the emotions of what they play. This paper investigates whether it is possible to develop an intelligent system that detects the intended emotion of the music being played through multimodal input and adjusts the lighting accordingly in real time. A concept for such an intelligent lighting system is developed and described. Drawing on existing research on music and emotion, as well as on musicians' body movements in relation to the emotion they want to convey, a set of cues is defined: amount, speed, fluency, and regularity for the visual modality, and level, tempo, articulation, and timbre for the auditory. Using a microphone and a Kinect camera to detect these cues, the system is able to infer the intended emotion of what is being played. Dedicated lighting designs are then developed to support specific emotions, and the system can switch between and adjust these designs based on the incoming cues. The results suggest that the intelligent emotion-based lighting system has an advantage over purely beat-synced lighting, and it is concluded that the idea merits further exploration.
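To illustrate the kind of cue-to-emotion mapping the abstract describes, the following is a minimal sketch, not the paper's actual classifier: it assumes the four auditory cues are normalized to 0..1 and matches them to the nearest of four hypothetical emotion prototypes. The emotion set and prototype values are assumptions made for illustration only.

```python
from math import dist

# Hypothetical prototype cue vectors (level, tempo, articulation, timbre)
# per emotion; these values are illustrative assumptions, not from the paper.
PROTOTYPES = {
    "happy":  (0.7, 0.8, 0.8, 0.6),
    "sad":    (0.3, 0.2, 0.2, 0.3),
    "angry":  (0.9, 0.9, 0.9, 0.9),
    "tender": (0.4, 0.3, 0.3, 0.4),
}

def classify_emotion(cues):
    """Return the emotion whose prototype is closest (Euclidean) to the cues."""
    return min(PROTOTYPES, key=lambda emotion: dist(PROTOTYPES[emotion], cues))

print(classify_emotion((0.8, 0.85, 0.75, 0.6)))  # nearest prototype: "happy"
```

A real-time system would recompute such a classification continuously over a sliding window of detected cues and crossfade between lighting designs as the output changes.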
