Music Creation and Emotional Content (MCEC)

A team from Digital Creativity Labs and the Department of Music at the University of York has developed a prototype that predicts emotional expressions in music as it is being created.

Focused on practical research into generative interactive music, the project has developed a system that models the emotional response of an audience in real time.
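
The article does not describe the model's internals. Purely as an illustrative sketch, a real-time predictor of this kind might slide a short analysis window over incoming audio, extract a few features per window, and map them to a valence/arousal estimate (a common two-dimensional representation of musical emotion). Everything in the snippet below, from the window sizes to the linear map, is an assumption for illustration rather than the MCEC system itself:

```python
import numpy as np

WINDOW = 2048  # samples per analysis window (assumed value)
HOP = 512      # hop between successive windows (assumed value)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Toy per-window features: RMS energy and spectral centroid."""
    rms = np.sqrt(np.mean(window ** 2))
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window))
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return np.array([rms, centroid])

def predict_emotion(features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Placeholder linear map from features to a (valence, arousal) pair."""
    return weights @ features

rng = np.random.default_rng(0)
audio = rng.standard_normal(16000)     # stand-in for a live audio buffer
weights = rng.standard_normal((2, 2))  # stand-in for trained parameters

# Emit one emotion estimate per hop, as a real-time system might.
predictions = [
    predict_emotion(extract_features(audio[s:s + WINDOW]), weights)
    for s in range(0, len(audio) - WINDOW + 1, HOP)
]
print(f"{len(predictions)} windows; first (valence, arousal) estimate: {predictions[0]}")
```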

The work will support composers and producers working across a range of storytelling media, including the production of soundtracks for films and video games.

Researchers built the model using data gathered during music-creation and perception experiments, in which participants rated how they felt while listening to music written by composers.
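
The article gives no detail on how those ratings were turned into a model. The sketch below assumes a simple setup in which each musical excerpt has a feature vector and a set of participant ratings, and a linear map is fit by least squares; the feature dimensions, rating scale, and choice of ordinary least squares are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

n_excerpts, n_features = 40, 5
X = rng.standard_normal((n_excerpts, n_features))  # stand-in audio features

# Each excerpt rated by 12 participants on valence and arousal (1-9 scale,
# assumed); average across participants to get one target pair per excerpt.
ratings = rng.integers(1, 10, size=(n_excerpts, 12, 2))
y = ratings.mean(axis=1)

# Ordinary least squares with an intercept column.
X1 = np.hstack([X, np.ones((n_excerpts, 1))])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

print("learned coefficients shape:", coef.shape)  # (n_features + 1, 2)
```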

The next phase of the study will gather feedback from composers about how they might work with the tool. The prototype model will also be released online.

The aim is to increase productivity and impact rather than to create music or take over the role of musicians; the model could eventually be developed into a plug-in for music production software.

Watch this space: more information about this exciting project is coming in the new year!