Explainer: interactive composition

Breakdown was an interactive audiovisual dance performance at Ears, Eyes and Feet in Austin, Texas, in May this year. Rodrigo Carvalho, Yago de Quay, Sunny Shen, CC BY-NC

Computers have shaken up how we create and listen to music, and there is more radical transformation to come. The potential for digital music to be interactive is leading to new ways to experience and to produce it. We are witnessing a trend towards “musified” interaction, where everyday experiences are embellished with interactive music.

Take The Listening Machine, an installation at London’s Barbican Centre last year. It used a running snapshot of mood on Twitter to drive the movement of a composition made out of a set of carefully crafted, recombinable segments recorded by the Britten Sinfonia.

Similarly, a recent Volkswagen concept video asks us to imagine being able to dynamically control a piece of music while driving. This time the music is the electronic rhythms of 90s techno pioneers Underworld:

Volkswagen Golf GTI + Underworld collaborated on a project that lets you create music through your driving.

The turns of the steering wheel run up and down a musical scale and the music’s intensity is modulated by the car’s speed. These works both use music that is partly pre-composed, and the design of the interaction is itself part of that composition, melding composition with interaction design. There are no instruments, but you still play the music as part of the experience; they are interactive compositions.
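The kind of mapping described above can be sketched in a few lines. This is a hypothetical, illustrative model (not Volkswagen's or Underworld's actual system): steering angle selects a note from a pentatonic scale, and speed sets the intensity.

```python
# A toy interactive-composition mapping (hypothetical, illustrative only):
# steering angle picks a note from a pentatonic scale, speed sets loudness.

PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees as semitone offsets from the root


def steering_to_note(angle_deg, root=60):
    """Map a steering angle (-90..90 degrees) onto a two-octave pentatonic run."""
    # Normalise the angle to 0..1, then index into 10 steps (two octaves of 5 notes).
    t = (max(-90, min(90, angle_deg)) + 90) / 180
    step = min(int(t * 10), 9)
    octave, degree = divmod(step, len(PENTATONIC))
    return root + 12 * octave + PENTATONIC[degree]


def speed_to_velocity(speed_kmh, max_speed=130):
    """Map car speed onto a MIDI-style velocity (intensity), 0..127."""
    return round(127 * max(0, min(speed_kmh, max_speed)) / max_speed)
```

Turning the wheel sweeps up and down the scale; pressing the accelerator makes whatever is playing louder. The interaction design and the musical material are inseparable.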

This is a natural step in the growing reach of digital design into physical environments, but its ancestry also lies in videogame music.

If you were a certain type of arty computer geek in 1983 you might have been excited to discover the videogame Moondust. The game’s music was generated in real time, note by note, making melodies in response to your character’s actions.
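Note-by-note generation of this kind can be surprisingly simple. Here is a sketch in the spirit of Moondust, though not the game's actual algorithm: each player action nudges a random walk up or down a scale, emitting one note per action.

```python
# A sketch of note-by-note generative melody in the spirit of Moondust
# (an illustration, not the game's actual algorithm): each player action
# nudges a random walk up or down a scale, emitting one note per action.
import random

SCALE = [0, 2, 3, 5, 7, 8, 10]  # a minor-ish mode, semitones above the root


def melody_for_actions(actions, root=57, seed=None):
    """Turn a stream of actions ('up'/'down'/other) into MIDI note numbers."""
    rng = random.Random(seed)
    pos, notes = 0, []
    for action in actions:
        if action == "up":
            pos += 1
        elif action == "down":
            pos -= 1
        else:  # any other action triggers a small random leap
            pos += rng.choice([-2, 2])
        pos = max(0, min(pos, len(SCALE) * 2 - 1))  # clamp to two octaves
        octave, degree = divmod(pos, len(SCALE))
        notes.append(root + 12 * octave + SCALE[degree])
    return notes
```

Because the melody is a function of play rather than a fixed recording, no two sessions sound quite the same.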

As an esoteric artwork, this crude-looking and sounding game was not quite a “killer demo”, but it spoke of a tantalising future. Some time later, English musician Brian Eno’s work in generative music gave more of a public profile to this imagined future of ever-changing music, from which our descendants would look back on our era of fixed recordings as a brief phase.

This vision is slowly playing out. Consider the booming new techno scene called Algorave, where artists write “live code” on stage, generating (surprisingly funky) mathematical musical patterns.
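A staple of this kind of algorithmic pattern-making is the Euclidean rhythm, which spreads a number of drum hits as evenly as possible across a cycle of steps. The sketch below uses a simple Bresenham-style approximation (one of several equivalent formulations, not any particular live-coding system's implementation):

```python
def euclid(hits, steps):
    """Spread `hits` onsets as evenly as possible across `steps` slots
    (a Bresenham-style approximation of the Euclidean rhythm)."""
    return [(i * hits) % steps < hits for i in range(steps)]


def show(pattern):
    """Render a pattern as a string: 'x' for a hit, '.' for a rest."""
    return "".join("x" if on else "." for on in pattern)
```

For example, `show(euclid(3, 8))` yields `"x..x..x."`, a pattern found in dance music and in traditional rhythms worldwide; a live coder can change `hits` and `steps` on the fly and hear the groove mutate.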

Algorave at Club Fierce, Birmingham. Antonio Roberts

A growing number of experimental artists are using interactive apps as a medium for composition – championed in the mainstream by experimentalists such as Björk and Radiohead.

While the album still dominates our collective consciousness, there are musics, old as well as new, that naturally resonate with this new capacity for digital dynamism. English “change ringing” (the art of ringing a set of tuned bells in a series of mathematical patterns), aspects of Indian classical music, and free jazz improvisation all emphasise the dynamism of process over the fixity of a recording.

The process of composing a piece of music to be interacted with is increasingly common, and videogames remain one of the most obvious contexts where this work is done. Often the technology behind videogame music is less sophisticated than you’d imagine, working with simple looped segments that mix in and out, following the gameplay in sometimes jarring transitions. But this is changing.
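One common way to smooth those transitions is to blend layered loops rather than cut between them. The sketch below is a minimal, hypothetical model (not any particular engine's API): a single game parameter, say "danger" from 0 to 1, sets the gains of a calm loop and a combat loop via an equal-power crossfade, so the score shifts gradually instead of jarringly.

```python
# A minimal sketch of blending looped score layers (illustrative only):
# one game parameter drives an equal-power crossfade between two loops,
# keeping total perceived loudness roughly constant during the transition.
import math


def layer_gains(danger):
    """Return (calm_gain, combat_gain) for a danger level in [0, 1]."""
    d = max(0.0, min(1.0, danger))
    return math.cos(d * math.pi / 2), math.sin(d * math.pi / 2)
```

At `danger = 0` only the calm loop sounds; at `danger = 1` only the combat loop; in between, the squared gains always sum to one, which is what keeps the crossfade from dipping in loudness halfway through.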

Boston-based videogame composer Ben Houge exemplifies the art of composing not only the raw elements of the music, but the detailed form of each transition. He builds software mock-ups of his compositions as he works on them, then collaborates with the game developers to incorporate the music into the game.

Ben Houge talks about a side project to his videogame work: pairing music and food – food opera.

The tools that can achieve this are becoming increasingly available and powerful. Australian company Firelight Technologies makes audio production software that plugs straight into popular game engines, allowing access to the game experience while working with sound.

Composers and sound designers can audition their work by playing the game, or play back recorded chunks of game action, letting them work in a more traditional linear way. The difference, of course, is that unlike regular composition, you are not just composing one piece of music, but an infinite space of combinations that need to be explored and carefully woven together.

As elsewhere, indie developers are tirelessly exploring the possibilities. Videogame designer/developer Ed Key and composer David Kanaga’s recent game Proteus is a sublime immersion into a strange (and otherwise goalless) low-res wonderworld where the environment and the music melt together in a synesthetic fusion, described by one reviewer as “the best song I’ve ever played”.

Proteus trailer.

Likewise, in Jeppe Carlsen, Jakob Schmidt and Niels Fyrst’s 140, the game’s smooth geometric transformations are perfectly beat-synced to an ongoing electronic dance music anthem.

Film and game music are a testament to how naturally music can be present without being the focus of attention.

There is nothing unusual about the idea that music should complete the backdrop to all kinds of experiences, including interactions with gadgets, buildings and street furniture.

Here, perhaps, is a story that bucks the trend for music makers: an opportunity for creators to bring life to the technologies of interactive experience, rather than simply be disrupted by new forms of distribution.