“ixi lang: A Constraint System for Live Coding” presented by Magnusson
Symposium:
- ISEA2010: 16th International Symposium on Electronic Art
Session Title:
- Musical Devices
Presentation Title:
- ixi lang: A Constraint System for Live Coding
Presenter(s):
- Thor Magnusson
Venue(s):
Abstract:
In the late 1990s a new performance practice appeared in the more experimental venues of the musical world, where performers would step on stage with a rather strange musical instrument: the laptop. These performance contexts, in pubs and clubs, were primarily designed for pop or rock bands. Instead of locating themselves behind the mixer, where the best sound is normally to be heard, these performers placed their equipment on the stage, typically on a table, and presented some rather refreshing and novel musical worlds. Whilst the audience appreciated the texturally sophisticated world of sound these instruments were capable of, the performance aspect of the music suffered. What were these musicians actually doing behind their screens on the stage?
A decade later, some solutions had evolved to address this lack of coupling between the performer’s gestures and the sound emitted by the speakers. One of them is VJing. By analyzing the sound signal – typically through Fast Fourier Transform analysis or even OSC messages sent from the sound-generating software – the VJ is able to generate visuals that connect with and represent the sound in endlessly interesting, yet arbitrary, ways. Another solution is represented by a field often called NIME (New Interfaces for Musical Expression), with university courses and conferences dedicated to its investigation (see www.nime.org). Here, various interfaces have been designed that allow the performer to use her body, in a manner inspired by acoustic instruments, to control a digital sound engine. The third response to the problem of the exclusiveness of computer music performance is live coding.