Simulating Synaesthesia in Real-Time Performance
Symposium:
Session Title:
- Visual Effects Remixed
Presentation Title:
- Simulating Synaesthesia in Real-Time Performance
Presenter(s):
Venue(s):
Abstract:
In this paper the author will describe and show examples of his live audio-visual work for 3D spatial environments. These projects use motion-tracking technology to enable users to interact with sound, light and video through their body movements in 3D space. Video examples of one past project (Virtual DJ) and one current project (Virtual VJ) will be shown to illustrate how subjective and flexible user interaction is enabled through a complex but predictable mapping of 3D space to media control. In these projects audience members can interact with sound, light and video in real time simply by moving around the space with a tracker in hand. Changes in real-time visual effects can be synchronized with changes in sound and/or light (e.g. music volume = light brightness = video opacity). These mappings can be reconfigured dynamically in real time, allowing the user to consolidate the roles of DJ, VJ and light designer in a single interface. This interaction model simulates the effect of synaesthesia, in which certain people experience light or colour in response to musical tones.
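The abstract does not specify how the mapping is implemented; the following is a minimal hypothetical sketch (all names, axis choices and ranges are assumptions, not the author's actual system) of the kind of mapping it describes: one tracker coordinate drives a single control value that is applied identically to music volume, light brightness and video opacity, so all three media change together.

```python
def normalize(value, lo, hi):
    """Clamp a coordinate into [lo, hi] and scale it to [0, 1]."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def map_position_to_media(x, y, z, y_range=(0.0, 3.0)):
    """Hypothetical mapping of a tracker's 3D position to media controls.

    Here the tracker's height (y, in metres over an assumed 0-3 m range)
    drives one shared control value, giving the synchronized behaviour
    volume = brightness = opacity described in the abstract. Other axes
    (x, z) are available for further mappings but unused in this sketch.
    """
    level = normalize(y, *y_range)
    return {
        "music_volume": level,
        "light_brightness": level,
        "video_opacity": level,
    }
```

Raising the tracker from the floor to full height would then fade all three media from silent/dark/transparent to full volume/brightness/opacity in lockstep; remapping which coordinate feeds which parameter is what would let one performer cover the DJ, VJ and light-designer roles.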