V2_, Institute for the Unstable Media in Rotterdam, is internationally recognized as one of the leading centres for research and practice in new media art and in technological and artistic experimentation. Through its many events and workshops it opens up wide-ranging discussion of the contemporary age and presents some of the most interesting work on the world creative scene.
On 13 December 2007 the research centre V2_ presented the event Test_Lab: Live_Coding. In Live Coding, code is treated as an interface for real-time improvisation and networked performance among artists. It gives the performer expressive, real-time control over events through the software code itself, and it enables live improvisation by using the code as the interface for interaction between different performers, and also between live coders and the audience.
Test_Lab: Live_Coding was an occasion to present performances and hands-on experiences with Live Coding programming languages, debates and presentations on the techniques and aesthetics of Live Coding, as well as an experiment in live-coded Augmented Reality. To understand the activity of Test_Lab within V2_, how this event came about, and how this technique and relatively recent theory are developing, I had the chance to talk to two of the organizers: Michel van Dartel, project manager at V2_, and Artem Baguinski, who has worked at the V2_Lab since 2000 as a software engineer, collaborating with resident artists, running workshops and pursuing personal projects.
Silvia Scaravaggi: What kind of activities does Test_Lab develop and support?
Michel van Dartel: The core idea of Test_Lab is to create an informal setting, or platform, for artists and developers to ‘test’ their latest artistic Research and Development (aRt&D) work. For each edition, we invite artists working within a specific aRt&D theme to bring their prototypes to Test_Lab for a live demonstration and, preferably, for the audience to try out and play with. In this sense, Test_Lab always works in two directions: artists use the Test_Lab audience as a ‘test panel’ to receive feedback on their ideas and work before it enters museums and festivals or before moving on to the next development stage. At the same time, the Test_Lab audience learns about the latest in artistic R&D and has buckets of fun trying things out themselves.
Regarding the format of Test_Lab, we like to experiment a lot and let the format depend on what we’d like to show or who we’d like to invite. In the past, we’ve done things like taking our audience out onto the street to play games; had concerts, Second Life performances, DJ performances, and audience jam-sessions; relocated Test_Lab to the Erasmus Medical Center; and included fashion shows, theater, and various mini-workshops in the program.
For each edition of Test_Lab we develop a different theme, and with those themes we try to tap into what we think are important developments and issues in current aRt&D. Moreover, the themes grow out of discussions in the V2_Lab: questions that arise from projects that we’re working on, technologies or approaches that we think are interesting, or things that we’d like to learn more about in view of an upcoming project at V2_. In past editions, we’ve focused on broad themes like technologies for the performing arts, technology in fashion, and the notion of play in electronic art, but also on more specific topics, such as physical audio interfaces and, most recently, Live Coding.
Silvia Scaravaggi: How did the two of you get involved with Live Coding? And how did you arrive at this Test_Lab event, starting from V2_?
Michel van Dartel: Test_Lab: Live_Coding is a typical example of how a Test_Lab theme evolves from V2_Lab discussions. For years, Artem has been involved in Live Coding and has been bugging the other Lab members with information on the software paradigm and how revolutionary its approach is. But it was only when we recently encountered some difficulties in the development of one of our projects, a collaboration between Rnul, Carla Mulder and V2_, that we decided to turn it into a Test_Lab theme. The problem that the project team encountered had to do with making an Augmented Reality environment more dynamic, so that it could be applied to improvised theater performances. Artem, logically, offered Live Coding techniques as a possible solution to the problem, and from there we decided to present the project, and its first steps towards a Live Coding solution, to the Test_Lab audience within a context of other Live Coding projects and performances. In this way we hoped to receive feedback that would be beneficial to the project’s further development and, at the same time, to introduce the audience to the Live Coding software paradigm.
Silvia Scaravaggi: How did you develop the project? What kind of projects and performances did you present during “Test_Lab: Live_Coding”?
Michel van Dartel: The evening was opened by Florian Cramer of the Piet Zwart Institute, who gave a very clarifying and entertaining presentation on the history of Live Coding. Besides introducing what is understood by Live Coding, he also related it to the philosophy of early free jazz and worked his way from there, through early electronic composers, towards the groundbreaking Live Coding work of TOPLAP. After Florian’s introduction, the Live Coding audio performance collective Powerbooks Unplugged introduced their approach and then gave a beautiful unplugged performance with their Powerbooks, spread out through the audience.
After PBUP’s performance, all chairs were taken out to make room for a combined presentation by interaction developers Rnul, theatre maker Carla Mulder, and V2_Lab’s Jan Misker and Jelle van der Ster. They demonstrated the result of a week of trying to apply Live Coding techniques to an Augmented Reality environment. First they handed out 3D glasses to the audience, so everyone could see the 3D projection of the Augmented Reality installation, and then Carla Mulder presented what they had developed so far and how she was planning to use it in her theatre performance. In fact, Carla’s presentation was very much a theatre performance in itself, making it a perfect test of what had been developed during the week before.
This was followed by a parallel session in which the audience could choose either to attend one of the mini-workshops on the Live Coding programming languages SuperCollider, led by PBUP, and Fluxus, led by Artem Baguinski, or to play with the Augmented Reality installation. The evening was closed with drinks and a very danceable audio performance, based on the Live Coding environment Max/MSP, by Susie Jae, aka Jean van Sloan.
Silvia Scaravaggi: In what way did the two of you collaborate? How did you combine your different professional and creative profiles?
Michel van Dartel: Although I curated the program, Artem was definitely the main source of information during the event’s preparation. First he gave me a personal introduction course to Fluxus and then directed me to a collection of relevant papers and websites on Live Coding. From there on, Artem focused on preparing his contribution to Test_Lab, the Fluxus mini-workshop, and I got in contact with Florian Cramer and some TOPLAP people – Alex McLean, Dave Griffiths, Adrian Ward, and Julian Rohrhuber – to further develop the program. Although not all of them could make it to Rotterdam for the event, they were a big help in setting up the program by getting me in touch with the right people. Of course, with any decision regarding the program I consulted Artem’s expertise on the topic.
Silvia Scaravaggi: Which themes emerged from the presentations and debates?
Michel van Dartel: It’s funny that you call them ‘emerging’ themes, because the evening slightly surprised me in the issues that came up during the debates. While we had set up the evening around the live improvisation aspect of Live Coding, and I had expected much discussion on things related to that (such as the discussion on whether actually showing the ‘raw code’ during Live Coding performances is essential), the discussion became most heated around the definition of Live Coding. For instance, according to some, what was used to make the Augmented Reality experience dynamic in the demo by Rnul, Carla Mulder, and V2_Lab did not fall within the definition of Live Coding, since the calls to the visuals were performed live but the visuals were not live-generated. According to others their approach did fall within the definition of Live Coding, since there was no need for compiling or rendering to make dynamic changes in the output of the software.
Artem Baguinski: Live Coding isn’t always about making code and media from absolute zero: although you definitely may start from scratch, often you’d use pre-written scripts as well as resources – textures, fonts, samples, etc. The crucial part is how you then manipulate them – once you start editing the code that uses those resources and executing the new version of the code on-the-fly and on-stage, you are coding live. In this case, what was coded live was the behaviour of the objects, not their appearance; that’s probably why it wasn’t so apparent.
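To make this distinction concrete, here is a minimal SuperCollider sketch of the same idea in the audio domain (illustrative only, not code from the demo; the file path and proxy name are placeholders): the sample is a fixed, pre-made resource, and only the code describing its behaviour is rewritten while it plays.

```supercollider
// pre-made resource: a sample loaded before the performance starts
b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");

// initial behaviour: plain looped playback
Ndef(\sample, { PlayBuf.ar(1, b, rate: 1, loop: 1) * 0.3 }).play;

// live edit, executed while the sound keeps running:
// the resource stays the same, only its behaviour is re-coded
Ndef(\sample, { PlayBuf.ar(1, b, rate: LFNoise0.kr(2).range(0.5, 2), loop: 1) * 0.3 });
```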
Michel van Dartel: Furthermore, there was a lot of discussion on the differences between using GUIs and Live Coding, but maybe Artem can explain this better.
Artem Baguinski: I guess in my corner of the floor the discussion went a somewhat different way: I found myself discussing some instruments frequently used in Live Coding – ChucK, Fluxus and SuperCollider – and how various aspects of their languages and GUIs help you to code (be it live or not). Neither using GUIs nor visual programming (as in PureData, Max/MSP or StarLogo TNG [1]) is foreign to Live Coding; it is the attitude – the desire to make your thought process public by displaying the code behind the media – that counts.
Some practitioners of Live Coding are constantly searching for more tangible ways to represent, create and input algorithms than traditional text-based programming languages and editors. Some examples are BetaBlocker [2] and Al-Jazari [3] by David Griffiths, and Rubik’s Cube DJ [4] by Douglas Edric Stanley.
[2]: (BetaBlocker): http://www.youtube.com/watch?v=-G8qDTYuhOM
[3]: (Al-Jazari): http://www.youtube.com/watch?v=Uve4qStSJq4
[4]: (Rubik’s Cube DJ): http://www.youtube.com/watch?v=4_Ta6TIJTQQ
These might not be approved by TOPLAP as “official” live coding instruments, but they do show an existing interest in alternatives to plain text.
Silvia Scaravaggi: I’d like to go deeper into the Live Coding experience by talking about its philosophy and its origins. How does it work? Where does it come from?
Artem Baguinski: Live Coding consists of programming in front of the audience, while the program one writes is running. Most often the program being (re)written generates or processes sound or visuals – “rich media”. So, what happens is – the performer has some sort of system that can generate or process rich media, and the way it actually does that can be specified by a program of some sort.
The performer starts the system and opens a program for it, or one could start from scratch, in an editor. OK, he/she thinks, I want this or that to happen and here is how I’d program that and he or she types in the program or modifies the existing one and starts it up. Now the audience and the performer can see or hear the result of the program that has just been written. It could be that the program doesn’t do exactly what was intended – due to an error or some misunderstanding; or the program does exactly that, but the performer now wants something different, maybe more complex, but still based on the original idea.
So he/she goes back to the editor and modifies the program and starts it again and repeats the process in the course of the performance again and again. Now, to make it more interesting for the audience, care is usually taken to make the transitions between the old version and the new seem smoother, or even that the original program doesn’t really stop, but just changes over time. Different artists use different tools and different ways to make Live Coding feel like a continuous coherent performance and not like a series of experiments. In practice one doesn’t often start really from scratch – you’ve got some code fragments prepared, some ideas tried out, maybe some library (code that you use without it actually appearing in the editor).
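As a concrete flavour of this edit-run-edit loop, here is a minimal SuperCollider sketch using the JITLib node proxies that many live coders rely on (illustrative only, not taken from any of the performances); the fade time is what keeps each re-evaluation from sounding like a hard cut.

```supercollider
// give the node proxy a crossfade time so edits morph rather than cut
Ndef(\idea).fadeTime = 4;

// first attempt: a bare tone
Ndef(\idea, { SinOsc.ar(220, 0, 0.15) }).play;

// not quite right, so back to the editor, change it, run it again:
// the old version fades into the new one instead of stopping
Ndef(\idea, { SinOsc.ar(220, 0, 0.15) * LFPulse.kr(2, 0, 0.3) });

// and again, building on the same idea as the performance goes on
Ndef(\idea, { SinOsc.ar(220 * LFNoise1.kr(0.3).range(0.9, 1.1), 0, 0.15) * LFPulse.kr(3, 0, 0.3) });
```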
Michel van Dartel: In the mini-workshops, Jan-Kees van Kampen (of Powerbooks Unplugged) illustrated this very clearly by starting up a simple sine tone in SuperCollider, typing the command for a sine and some parameters between brackets to define its variables, the amplitude et cetera. Then, while fiddling with the parameters a bit, changing the pitch of the tone, and adding some more commands to the line to introduce temporal manipulations, the tone changes into a cute repetitive pattern. He explains that this is the type of thing he prepares for a performance: during the performance he doesn’t write every sound from scratch, but prepares some lines of code like he did just now and simply pastes them into the running editor. Pasting these pre-prepared code fragments and fiddling with their parameters is how he improvises with the other Live Coders or whoever else is involved.
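Something along the lines of what Michel describes might look like this in SuperCollider (a hypothetical sketch, not Jan-Kees van Kampen’s actual code):

```supercollider
// a plain sine tone: frequency, phase, amplitude between the brackets
x = { SinOsc.ar(440, 0, 0.2) }.play;

// fiddle with the parameters and add a temporal manipulation,
// so the static tone turns into a repetitive pattern
x.free;
x = {
    var freq = LFSaw.kr(0.5).range(300, 600).round(50); // slowly rising, stepped pitch
    SinOsc.ar(freq, 0, 0.2) * LFPulse.kr(4, 0, 0.5)     // gated into a simple rhythm
}.play;
```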
Silvia Scaravaggi: What is Live Coding’s place within the contemporary art scene?
Artem Baguinski: On the theoretical side: self-criticising, self-defining, self-searching – there is a lot of reflection going on: what constitutes Live Coding, is it an autonomous form of art, or is it a stylistic addition to more traditional audiovisual performance? What is the importance of making the code visible, and does it matter which form the code takes on the screen? What does it mean to the artist and the audience?
On the practical side: experimental and daring. The theoretical discussions arise from praxis and from re-evaluating one’s own experience and motivation, but they don’t constrain what people actually do. You could say the theory of Live Coding, if there is such a thing, is descriptive rather than prescriptive – it analyses what’s going on and attempts to reinvent itself accordingly, rather than giving guidelines on how Live Coding should be practised.
Silvia Scaravaggi: How does Live Coding modify the role of the programmer in the creative process?
Artem Baguinski: In the context of performance I’m not sure it does – there are artists who can code and who practise the performing arts; both activities are creative endeavours, and eventually some of those artists find themselves combining the two, by creating their own instruments or coding during the performance or both. Rather than changing their role, Live Coding allows them to demystify it – look, I am programming now and here is how I do it and here is what my code looks like and here is what it does – all at the same time, as a coherent and completely open performance.
However, when we at the V2_Lab use Live Coding practices in an augmented reality art production process, I think the character of our involvement as engineers does change – it becomes much more direct and immediate, due to the much faster feedback that we and the artists get. Production of complex electronic installations is always an iterative process of trial, error and re-evaluation, and the ability to try various ideas out on-the-fly, as they arise in a brainstorming or improvisation session, transforms the engineer into a sort of organic interface to the technology, or an augmented actor – depending on the perspective taken.
Silvia Scaravaggi: How would you describe the kind of interaction and interactivity Live Coding is able to develop?
Artem Baguinski: Ideally, what we aim at is interaction between the physical environment surrounding the artist and the software system, at the very intimate level of the code constituting that system. There are multiple feedback loops here, because the input/output of the software system connects it to the environment, just as the artist’s own sensory system does.
Silvia Scaravaggi: Within Live Coding, what is the meaning of terms like improvisation, live performance and interface?
Artem Baguinski: Many artists and programmers make art with computer code – look at runme.org or at the Obfuscated C Code Contest – I consider some of the entries there a sort of code poetry. Code-based art is very holistic – often its beauty is only apparent when seen against the background of a certain computer subculture or language. Obfuscated code might use unconventional formatting to represent the structure or some properties of the algorithm that the very same code implements, just like the Mouse’s Tale poem in Carroll’s Alice in Wonderland.
And just like you’ve got to understand English to see the link between its content and its mouse-tail-like shape, you’ve got to understand code and often be familiar with programmers’ culture, folklore and mythos, to see all the “inside jokes”, references and analogies.
By creating the code on the fly and modifying it as it runs, the live coder gives the audience a chance to get a feeling for what it means, even if the language is unfamiliar or the text is barely readable. By following the editing actions, the fixing of mistakes, the hesitations and the scrolling through the code, and hearing or seeing the result of the code at the same time, the audience comes somewhat closer to a holistic appreciation of what is going on – since they don’t only experience the media output, or only see the code behind it, but experience both simultaneously as they come to life and evolve. Improvisation is the goal of live coding – that’s why you do it live, and that’s why you create and improve the instruments, to get them out of the way as much as possible while keeping them useful and powerful. And in fact the interface has a dual use: for the performer the editor is the interface to the instrument; it can constrain or empower. For the audience, on the other hand, the editor is yet another interface to the artist’s mind, compensating for the physical body often being hidden behind the laptop.
Silvia Scaravaggi: What happens between a live coder and those who interact with him – another artist, the audience, a DJ or a performer?
Artem Baguinski: Technically the simplest but otherwise the most important connection with other performers is the one that goes through one’s brain – just as in any collaborative improvisation, you watch and listen to your co-performers and let them influence your ideas as to where to go next. Often you know upfront what to expect from your collaborators, and it helps to prepare your own material so that it works well with that. But again – you can start blank and work towards coherence on-the-fly. And since we’ve got computers, we can let them do some of the boring tasks, freeing ourselves for more abstract, high-level ‘linking’. Thus, when working with a musician you could make the computer analyse the microphone input and turn the sound into input for the code. When collaborating with other computer performers you could send each other signals, or even fragments of code, over a local wired or wireless network.
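A minimal SuperCollider sketch of the first idea (illustrative only; the specific mapping from loudness to pitch is made up): the microphone signal of a co-performer is analysed and its level is fed straight into the live-coded sound.

```supercollider
// analyse the microphone input and let its loudness drive the code
Ndef(\follow, {
    var level = Amplitude.kr(SoundIn.ar(0), 0.01, 0.3); // envelope follower on the mic
    SinOsc.ar(200 + (level * 1000), 0, 0.2)              // louder co-performer -> higher pitch
}).play;
```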
Michel van Dartel: I think that in terms of interaction between a Live Coder and other kinds of artists and/or the audience, the strength of Live Coding is not so much that it produces completely different audio or visuals than could be achieved with standard software and hardware… Live Coding’s real strength is that it provides absolute freedom in the manipulations that can be carried out there and then, at that moment. It therefore has a great impact on the possibilities for live improvisation, and results in a completely different improvisational (interactive) process than would be possible using standard software and hardware. In other words, it basically enlarges the creative space by minimising the parameters defining that space.
When you use a sequencer to manipulate a tone, you are restricted to the functions of the faders and knobs and the predefined procedures of the sequencer; with Live Coding there are basically no such restrictions, and any manipulation that you’d like to do can be coded at that moment, there and then, providing a much larger freedom to improvise. As a musician I find this very appealing, although it seems to take a while to develop the skills required, since, so far, I’m still only fiddling with the parameters of simple sine tones… On the other hand, people spend lifetimes mastering the piano, so why not spend as much time mastering your laptop?