
As video games have grown from crude, circuit-based artefacts into a multibillion-dollar worldwide industry, video-game music has become increasingly adaptive. Composers have had to adopt new techniques to move beyond the traditional event-based approach, in which music consists mostly of looped audio tracks and can therefore become overly repetitive. Moreover, looped tracks scale poorly to the design of today's games, whose narratives have become increasingly complex and nonlinear. This paper outlines experience-driven procedural music generation as a possible way forward for dynamically generating music and audio from user gameplay metrics. © 2009-2012 IEEE.
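The core idea of generating music from gameplay metrics can be sketched minimally. The mapping below is hypothetical (the paper's approach is experience-driven and learned, not a fixed rule); the function names and the specific metrics `health` and `enemy_count` are illustrative assumptions, not taken from the paper.

```python
def music_parameters(health, enemy_count, max_enemies=10):
    """Map gameplay metrics to musical parameters.

    Hypothetical rule-based sketch: tension rises as the player's
    health drops and the number of nearby enemies grows, and tension
    in turn drives tempo, intensity, and mode.
    """
    # Tension in [0, 1]: scaled enemy pressure times player's peril.
    tension = min(1.0, enemy_count / max_enemies) * (1.0 - health)
    return {
        "tempo_bpm": 90 + int(tension * 60),          # 90 (calm) .. 150 (frantic)
        "intensity": round(tension, 2),               # layer/volume control
        "mode": "minor" if tension > 0.5 else "major" # darker harmony under threat
    }
```

A music engine would poll such a function each bar and crossfade between pre-authored layers or regenerate material accordingly, rather than looping a single fixed track.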

Original publication

DOI

10.1109/TCIAIG.2012.2212899

Type

Journal article

Journal

IEEE Transactions on Computational Intelligence and AI in Games

Publication Date

26/09/2012

Volume

4

Pages

192–198