Abstract

Musical co-creativity aims to make humans and computers collaborate to compose music. As an MIR team working in computational musicology, we experimented with co-creativity when writing our entry to the “AI Song Contest 2020”. Artificial intelligence was used to generate the song’s structure, harmony, lyrics, and hook melody independently and as a basis for human composition. It was a challenge from both the creative and the technical points of view: in a very short time frame, the team had to adapt its own simple models, or experiment with existing ones, for a related yet still unfamiliar task, music generation through AI. The song we propose is called “I Keep Counting”. We openly detail the process of songwriting, arrangement, and production. This experience raised many questions on the relationship between creativity and the machine, both in music analysis and generation, and on the role AI could play in assisting a composer in their work. We experimented with “AI as automation”, mechanizing some parts of the composition, and especially with “AI as suggestion” to foster the composer’s creativity, thanks to surprising lyrics, uncommon successions of sections, and unexpected chord progressions. Working with this material was thus a stimulus for human creativity.

Highlights

  • Music creation experiments involving an artificial system are as old as the idea of computing

  • In 1955-56, several projects generated notated musical content with computers, including a program applying the combinatorial compositional rules of the 18th-century dice game attributed to Mozart, the “Musikalisches Würfelspiel”, the song “Push Button Bertha” by Klein and Bolitho, written through Monte Carlo sampling from rules (Ariza, 2011), and the famous “Illiac Suite” generated with Markov chains by Hiller Jr. and Isaacson (1957)

  • In the co-creative experiment described in this paper, we adopt this approach for the melody, and extend it to the generation of additional musical layers, namely chord sequences, lyrics, and global structure
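The paper itself does not include code; as a minimal sketch of the Markov-chain approach named above, assuming a toy corpus of chord progressions (the chord symbols, corpus, and function names below are illustrative placeholders, not the authors' data or implementation), a first-order chain can be trained on chord transitions and sampled to produce a new progression:

```python
import random
from collections import defaultdict

# Hypothetical training corpus of chord progressions (placeholder data,
# not the dataset used for "I Keep Counting").
corpus = [
    ["C", "Am", "F", "G"],
    ["C", "F", "G", "C"],
    ["Am", "F", "C", "G"],
]

def train_first_order(sequences):
    """Count transitions chord -> next chord (first-order Markov chain)."""
    transitions = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            transitions[current][nxt] += 1
    return transitions

def sample_progression(transitions, start, length, rng=random):
    """Random walk over the transition counts, weighted by frequency."""
    progression = [start]
    for _ in range(length - 1):
        nexts = transitions.get(progression[-1])
        if not nexts:  # dead end: fall back to the initial chord's transitions
            nexts = transitions[start]
        chords, counts = zip(*nexts.items())
        progression.append(rng.choices(chords, weights=counts, k=1)[0])
    return progression

if __name__ == "__main__":
    model = train_first_order(corpus)
    print(sample_progression(model, start="C", length=8))
```

The same counting-and-sampling scheme extends naturally to other symbolic layers (section labels for the global structure, words for the lyrics), which is the extension the highlight above refers to.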


Summary

I Keep Counting

The notes played in the piano track in every section are a human-made voicing of the generated chord sequences (Section 3.2), with occasional additional non-chord notes. At the end of the song (Bridge 2 and Chorus 3), this piano track is duplicated with an arpeggiator MIDI effect and accompanied by an additional pad track, rendered with the Pad VSTi, which plays the same notes. The African Kalimba VSTi was used to render the intro/outro hook. As mentioned earlier, this hook was generated almost at the end of the work (D-9); by that time, the team had an increasingly personal and precise idea of the targeted final song and selected a sound that fits it. Since the piano/pad, string, and bass tracks are human-composed and play, from our point of view, a background role, the team tried to keep them as discreet as possible. The final mix and mastering were done by a professional sound engineer.
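
The arrangement itself was played and mixed by hand in a DAW. Purely as a hedged illustration of the two textures described above (block-chord voicings plus an arpeggiated duplicate of the same notes), the sketch below writes two MIDI tracks with the mido library; the voicings, velocities, and durations are invented placeholders, not the values used in the song.

```python
import mido

# Hypothetical chord voicings as MIDI note numbers; the song's actual
# voicings were played by hand and are not reproduced here.
VOICINGS = {
    "C":  [48, 60, 64, 67],
    "Am": [45, 57, 60, 64],
    "F":  [41, 57, 60, 65],
    "G":  [43, 59, 62, 67],
}

TICKS_PER_BEAT = 480          # mido's default resolution
BAR = 4 * TICKS_PER_BEAT      # one chord per 4/4 bar

def block_chord_track(progression):
    """Piano-style track: each voicing held as a block chord for one bar."""
    track = mido.MidiTrack()
    for chord in progression:
        notes = VOICINGS[chord]
        for n in notes:
            track.append(mido.Message("note_on", note=n, velocity=80, time=0))
        for i, n in enumerate(notes):
            # First note_off carries the bar-long delta time, the rest are simultaneous.
            track.append(mido.Message("note_off", note=n, velocity=0,
                                      time=BAR if i == 0 else 0))
    return track

def arpeggio_track(progression):
    """Arpeggiated duplicate: the same notes cycled as eighth notes."""
    step = TICKS_PER_BEAT // 2
    track = mido.MidiTrack()
    for chord in progression:
        notes = VOICINGS[chord]
        for k in range(BAR // step):  # eight eighth notes per bar
            n = notes[k % len(notes)]
            track.append(mido.Message("note_on", note=n, velocity=70, time=0))
            track.append(mido.Message("note_off", note=n, velocity=0, time=step))
    return track

if __name__ == "__main__":
    progression = ["C", "Am", "F", "G"]  # e.g. sampled from a chord model
    mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
    mid.tracks.append(block_chord_track(progression))
    mid.tracks.append(arpeggio_track(progression))
    mid.save("keep_counting_sketch.mid")
```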

