Abstract

Adaptive music, which changes nonlinearly in response to in-game situations, has advanced rapidly since the advent of audio middleware in the 1990s. As a pivotal component alongside visuals, it enriches the player experience, deepening narrative and gameplay and fostering immersion. This study analyzes methods of adaptive music generation and proposes effective techniques, using 'The Legend of Zelda: Tears of the Kingdom' as a case study. Changes in the field music across different game areas were analyzed with respect to in-game time and player-character location. The results show that 'Tears of the Kingdom' randomly rearranges mutually harmonious audio elements and introduces arbitrary differences in their playback timing. Because these methods are resource-efficient, they can be applied effectively on storage-constrained platforms such as portable gaming devices and mobile devices.
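The layering technique identified in the abstract, randomly reordering a pool of mutually harmonious stems and offsetting their start times, can be sketched as follows. This is a minimal illustrative model, not the game's actual implementation; the stem names, the offset range, and the function name are assumptions introduced for the example.

```python
import random

def schedule_field_music(stems, max_offset_s=4.0, rng=None):
    """Return a randomized playback plan as (stem, start_offset_seconds) pairs.

    Models the two techniques described in the study:
    1. random rearrangement of mutually harmonious audio elements, and
    2. arbitrary (randomized) differences in playback start timing.
    """
    rng = rng or random.Random()
    order = stems[:]       # copy so the caller's list is left untouched
    rng.shuffle(order)     # random rearrangement of the harmonious elements
    # Each stem begins at a small random offset, so repeated visits to the
    # same area never produce an identical mix.
    return [(stem, round(rng.uniform(0.0, max_offset_s), 2)) for stem in order]

if __name__ == "__main__":
    # Hypothetical stems; real asset names are not given in the source.
    stems = ["piano_motif", "strings_pad", "harp_arpeggio", "ambient_wind"]
    for stem, offset in schedule_field_music(stems, rng=random.Random(42)):
        print(f"{stem}: start at +{offset}s")
```

Because only a handful of short stems are stored and variation arises at playback time, the approach needs far less storage than pre-rendering many full-length tracks, which is the resource advantage the study highlights.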
