Simulating Synaesthesia in Real-Time Performance


Session Title:

  • Visual Effects Remixed

Presentation Title:

  • Simulating Synaesthesia in Real-Time Performance


    In this paper the author will describe and show examples of his live audio-visual work for 3D spatial environments. These projects use motion-tracking technology to enable users to interact with sound, light and video through their body movements in 3D space. Video examples of one past project (Virtual DJ) and one current project (Virtual VJ) will be shown to illustrate how subjective and flexible user interaction is enabled through a complex but predictable mapping of 3D space to media control. In these projects, audience members can interact with sound, light and video in real time simply by moving around in space with a tracker in hand. Changes in real-time visual effects can be synchronized with changes in sound and/or light (e.g. music volume = light brightness = video opacity). These mappings can be changed dynamically in real time, allowing the user to consolidate the roles of DJ, VJ and lighting designer in a single interface. This interaction model simulates the effect of synaesthesia, in which certain people experience light or colour in response to musical tones.
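    The coupled mapping described above (music volume = light brightness = video opacity) can be sketched in a few lines. The following Python fragment is a hypothetical illustration, not the author's implementation: it assumes a tracker reporting a 3D position in metres and arbitrarily chooses the vertical axis as the shared control parameter, so raising the tracker raises all three media values together.

    ```python
    def clamp01(x: float) -> float:
        """Clamp a value into the normalized 0..1 control range."""
        return max(0.0, min(1.0, x))

    def map_tracker_to_media(pos, room_height=3.0):
        """Map a 3D tracker position (x, y, z in metres) to media controls.

        Hypothetical mapping: the tracker's height (z) drives music volume,
        light brightness and video opacity in lockstep, simulating the
        synaesthetic coupling of sound and light described in the abstract.
        """
        x, y, z = pos
        level = clamp01(z / room_height)
        return {
            "music_volume": level,
            "light_brightness": level,
            "video_opacity": level,
        }
    ```

    In a real system the returned values would be sent on to an audio engine, a lighting controller and a video mixer each frame; remapping which axis drives which parameter at runtime is what lets one performer cover the DJ, VJ and lighting roles from a single tracker.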