Generative music is a term coined by Brian Eno to describe music generated by a system: music that is ever-changing and varied, yet follows a set of rules. Ambient music – a genre Eno founded – is “music created to fill a given space, that can either be listened to carefully or as easily ignored”.
“Twenty years ago, or maybe more, I became interested in processes that can produce music without a specific purpose,” Eno says. “The first example of this concept is the Aeolian bell. If you build a set of Aeolian bells, you can determine the framework within which music can ‘happen’, but you can’t precisely define how that music will unfold over time. It’s a way of producing music that is not entirely defined.”
I recently had the opportunity to interview musician and software designer Peter Chilvers, co-founder of the Burning Shed record label. He has composed several generative soundtracks for the Creatures games (released by Gameware Development) and has turned Brian Eno’s ideas into reality, creating with him three applications for the iPhone and iPod Touch. These new “programmes” (Bloom, Air and Trope) sit somewhere between record and audiovisual work, between album and “mobile” installation, all collected on a device so widespread that it has extended the territory Apple has won in recent years.
“These generative systems depend a great deal on the connections made by the observer/listener,” Eno said. “In my art there’s no synchronization of audio tracks and lights, even if people think there is. The synchronization happens inside them.”
After its ascent with the iPod and the iTunes Music Store, the largest music store on the net, Apple has recently brought the Multi-Touch revolution in mobile computing to both the consumer and professional sectors: new devices are redefining perceptual experience with new synaesthetic stimuli, introducing ever more sophisticated sensors to overlap the visual, the aural and the tactile.
“There are three alternatives: live, recorded and generative music. Generative music enjoys the benefits of both its precursors. Like live music, it is always different. Like recorded music, it is free from the limits of space and time: you can listen to it wherever and whenever you want. And it keeps another asset of recorded music: it can be composed through empirical methods. By this I mean that you can hear it as it is being produced: it isn’t subject to the long process proper to composed-written-and-played music.” – Brian Eno
Eno has chosen the Apple platform to distribute his “generative software”, and the same development environment is used in his installation 77 Million Paintings, in which the artist explores sound and light in pursuit of further aesthetic possibilities: the title in fact comes from the number of possible combinations of video and music the software can generate.
“One of the main aspects of these images is the fact that there’s no beginning and no ending,” says Eno. “I want them to give the sensation of being endless.”
Peter Chilvers explains the idea behind this artistic product, as well as the approach to technology that allows users to customise their audio experience: although Eno and Chilvers relinquish control over the final result, they provide highly refined yet deliberately unfinished audio samples, giving us the ability both to reproduce them and to manipulate them into an endlessly varied audio programme.
Matteo Milani: Thanks for your time. Peter Chilvers as a musician first: a few words about the ‘A Marble Calm’ project.
Peter Chilvers: I happened across the phrase ‘A Marble Calm’ on holiday a few years ago, thought it sounded like an interesting band name, then started thinking about the type of band that might be. The more I thought about it, the more it seemed to tie up a number of ideas that were interesting to me: drifting textural ambient pieces, improvisation and song. By making it a loose collective, it’s enabled me to bring in other vocalists and musicians I’ve enjoyed working with on other projects – vocalists Sandra O’Neill (who also worked with me on ‘Air’ for the iPhone) and Tim Bowness, marimba player Jon Hart and flautist Theo Travis.
Matteo Milani: When did you start working with generative music?
Peter Chilvers: In the ’90s I worked as a software developer on the ‘Creatures’ series of games. When we started on Creatures 2, I was given the opportunity to take over the whole soundtrack. The game wasn’t remotely linear – you spent arbitrary amounts of time in different locations around an artificial world, so I wanted to create a soundtrack that acted more as a landscape. I ended up developing a set of ‘virtual improvisers’, constantly generating an ambient soundscape in the background – it was quite involved actually, with its own simple programming language, although little of that was visible to the user.
[…] Peter chose to use his background in improvised music to create an array of “virtual musicians” that would play along to the action on screen. Each composition in Creatures contains a set of “players”, each with their own set of instructions for responding to the mood of the norns on screen.
Peter was able to generate much more interesting effects using recorded instruments rather than the General MIDI sounds generated by a soundcard, which can often be quite restrictive. This meant he could take advantage of the many different ways a note on a “live” instrument can be played – for example, on a guitar the sound changes greatly depending on the part of the finger used to strike a string, and on a piano when one note is played, all the other strings vibrate too. By altering the stereo effects, he could also fatten the sound at certain moments.
He also made use of feedback loops within the soundtrack. Feedback loops were first experimented with in the 1970s – if any of you can remember Brian Eno, you may be interested to know he composed most of his music then using this method. The idea is that you play a track and record it into RAM (onto a tape back in the 1970s). After a short while (around 8 seconds in Creatures 2), the loop starts and the original sounds are played back, so the composer carries on creating sounds in response to what has gone before.
Behind the scenes, scripts control the music engine and set the volume, panning and interval between notes as the mood and threat level change.
[via gamewaredevelopment.co.uk/]
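To make the idea concrete, here is a minimal sketch of a mood-driven “virtual player” of the kind described above. All names and mappings (how mood and threat translate to volume, panning and note interval) are my own illustrative assumptions, not Gameware’s actual engine:

```python
import random

class VirtualPlayer:
    """Hypothetical 'virtual improviser': picks notes from a scale,
    with volume, pan and timing shaped by mood and threat levels."""

    def __init__(self, scale, seed=None):
        self.scale = scale
        self.rng = random.Random(seed)

    def next_event(self, mood, threat):
        # mood, threat in [0, 1]; the mappings below are invented for
        # illustration: threat raises volume and quickens the pace,
        # mood widens the stereo spread.
        note = self.rng.choice(self.scale)
        volume = 0.3 + 0.7 * threat               # louder as threat rises
        pan = self.rng.uniform(-mood, mood)       # wider spread with mood
        interval = 0.25 + 2.0 * (1.0 - threat)    # seconds until next note
        return {"note": note, "volume": round(volume, 2),
                "pan": round(pan, 2), "interval": round(interval, 2)}

player = VirtualPlayer(scale=["C4", "E4", "G4", "A4"], seed=1)
event = player.next_event(mood=0.5, threat=0.2)
```

A game script would call `next_event` repeatedly, feeding in the current mood and threat, and hand each resulting event to the audio engine.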
Matteo Milani: Why did you choose the Apple platform to develop the applications?
Peter Chilvers: I’ve been a huge fan of Apple products for a long time, and their timing in releasing the iPhone couldn’t have been better. Bloom actually existed in some form before the iPhone SDK was announced – possibly before even the iPhone itself was announced. From the second we tried running the prototype, it was obvious that it really suited a touch screen. And Apple provided one! The difficulty developers have faced with generative music to date has been the platform. Generative music typically requires a computer, and it’s just not that enjoyable to sit at a computer and listen to music. The iPhone changed that – it was portable, powerful and designed to play music.
Matteo Milani: Who designed the visualizations of Bloom? Eno himself?
Peter Chilvers: It was something of a two-way process. I came up with the effect of circles expanding and disappearing as part of a technology experiment – Brian saw it and stopped me making it more complex! Much of the iPhone development has worked that way – one of us would suggest something and the other would filter it, and this process repeats until we end up with something neither of us imagined. Trope, our new iPhone application, went through a huge number of iterations, both sonically and visually, before we were happy with it.
Matteo Milani: What kind of algorithms define Bloom’s musical structure? Are they specifically based on Brian’s requests or just an abstraction based on his previous works?
Peter Chilvers: Again, this is something that went back and forth between us a number of times. As you can see, anything you play is repeated back at you after a delay. But the length of that delay varies in subtle but complex ways, and keeps the music interesting and eccentric. It’s actually deliberately ‘wrong’ – you can’t play exactly in time with something you’ve already played, and a few people have mistaken this for a bug. Actually, it was a bug at one point – but Brian liked the effect, and we ended up emphasising it. “Honour thy error as a hidden intention” is something of a recurring theme in Brian’s work. A forthcoming update to Bloom adds two new ‘operation modes’, one of which was designed specifically to work with the way Brian prefers playing Bloom.
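The drifting-delay idea Chilvers describes can be sketched in a few lines. This is my own illustration, not Bloom’s actual code: each note is echoed back at roughly a fixed delay, but every repeat drifts by a small random amount, so the echoes never line up exactly with what was played:

```python
import random

def schedule_echoes(events, base_delay=4.0, jitter=0.15, repeats=3, seed=0):
    """events: list of (time, note) pairs. Returns echo events whose
    delay drifts slightly on each repeat - 'deliberately wrong'."""
    rng = random.Random(seed)
    echoes = []
    for t, note in events:
        delay = base_delay
        for _ in range(repeats):
            # each repeat nudges the delay a little off the grid
            delay += rng.uniform(-jitter, jitter)
            t = t + delay
            echoes.append((round(t, 3), note))
    return sorted(echoes)

echoes = schedule_echoes([(0.0, "E4"), (1.5, "G4")])
```

The `base_delay`, `jitter` and `repeats` values are illustrative guesses; the point is only that a small accumulated drift is enough to keep repeats from ever being exact.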
Matteo Milani: Does the graphics and audio engine use standard audio and video libraries, or did you write your own classes?
Peter Chilvers: I’ve built up my own sound engine, which I’m constantly refining and use across all the applications. It went through several fairly substantial rewrites before I found something reliable and reusable.
Matteo Milani: Is all the code in Objective-C, or did you use any external application?
Peter Chilvers: It’s all Objective-C. I hadn’t used the language before, although I’d worked extensively in C++ in the past. It’s an odd language to get used to, but I really like it now.
Matteo Milani: Is Bloom sample-based? What is the music engine actually controlling (e.g. triggering, volume, panning, effects)? What about the algorithmic side of the music engine?
Peter Chilvers: Bloom is entirely sample-based. Brian has a huge library of sounds he’s created, which I was curating while we were working on the Spore soundtrack and other projects. It’s funny, but the ones I picked were just the first I came across that I thought would suit Bloom. We later went through a large number of alternatives, but those remained the best choices. The version of Bloom that’s currently live uses fixed stereo samples, but an update we’re releasing soon applies some panning to the sounds depending on the position of each ‘bloom’ on screen. It’s a subtle effect, but it works rather well.
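Position-dependent panning of the kind Chilvers mentions is commonly done with a constant-power pan law, which keeps perceived loudness steady as a sound moves across the stereo field. The sketch below is an assumption on my part – a standard audio technique, not necessarily what Bloom itself uses:

```python
import math

def pan_gains(x, screen_width=320.0):
    """Map a horizontal screen position x (0..screen_width) to
    (left, right) channel gains via a constant-power pan law."""
    pos = min(max(x / screen_width, 0.0), 1.0)  # normalise to 0..1
    angle = pos * math.pi / 2                   # 0 = hard left, pi/2 = hard right
    return math.cos(angle), math.sin(angle)

# A 'bloom' tapped at the centre of a 320-point-wide screen:
left, right = pan_gains(160.0)
```

With this law, a tap at the centre yields equal gains of about 0.707 per channel, and `left**2 + right**2` stays constant at 1 wherever the tap lands.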
Matteo Milani: Would you like to describe your current and upcoming projects?
Peter Chilvers: I’ve been involved in two new applications for the iPhone: Trope and Air. Both apps were intended to be released simultaneously. Trope is my second collaboration with Brian Eno, and takes some of the ideas from Bloom in a slightly different, slightly darker direction. Instead of tapping on the screen, you trace shapes and produce constantly evolving abstract soundscapes. Air is a collaboration with Irish vocalist Sandra O’Neill, and is quite different to Bloom. It’s a generative work centred around Sandra’s vocal textures and a slowly changing image. It draws heavily on techniques that Brian has evolved over his many years working on ambient music and installations, as well as a number of the generative ideas we’ve developed more recently.
I have just had some interesting news: Trope has been approved – it should be in the App Store very soon! www.generativemusic.com
“Trope is a different emotional experience – more introspective, more atmospheric. It shows that generative music, as one of the newest forms of sonema, can draw on a broad palette of moods.” – Brian Eno.
Brian Eno discussing Generative Music at the Imagination Conference, 1996