“Autolume: Automating Live Music Visualization” presented by Kraasch and Pasquier


Session Title:

  • Sound, Data and AI

Presentation Title:

  • Autolume: Automating Live Music Visualization

Abstract:

  • Deep learning is becoming increasingly accessible to artists, and both generative and discriminative models are now used for artistic expression. We distill approaches from research and installations involving GANs into a live VJing program: an interactive tool that visualizes music through live audio feature extraction and a MIDI controller, allowing artists to accompany live performances. Following previous approaches to offline audio-reactive visuals with GANs, we map the amplitude, onset strength, and notes of the music to the generated images. Furthermore, drawing on findings in interpretable GANs and on Network Bending techniques, we incorporate a MIDI controller of the kind commonly used by VJs, letting the artist adjust the visuals through a familiar interface.
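The mapping described above, from audio features to the generator's input, can be sketched as follows. This is a minimal illustration, not Autolume's actual code: the function names, gain parameters, and the specific mapping (amplitude scales the latent vector, onset strength adds jitter) are assumptions chosen to show the general audio-reactive pattern.

```python
import numpy as np

def audio_features(frame, prev_mag):
    """Extract amplitude (RMS) and onset strength (spectral flux)
    from one audio frame. Names are illustrative, not Autolume's API."""
    rms = np.sqrt(np.mean(frame ** 2))
    mag = np.abs(np.fft.rfft(frame))
    # Spectral flux: summed positive magnitude increase between frames,
    # a common proxy for onset strength.
    flux = np.sum(np.maximum(mag - prev_mag, 0.0))
    return rms, flux, mag

def modulate_latent(z, rms, flux, amp_gain=2.0, onset_gain=0.01, rng=None):
    """Nudge a GAN latent vector with the audio features: amplitude
    scales the vector, onset strength adds a random perturbation.
    A real system would feed the result to the generator each frame."""
    rng = rng if rng is not None else np.random.default_rng(0)
    jitter = rng.standard_normal(z.shape)
    return z * (1.0 + amp_gain * rms) + onset_gain * flux * jitter

# Demo on a synthetic 1 kHz tone frame.
sr, n = 44100, 1024
t = np.arange(n) / sr
frame = 0.5 * np.sin(2 * np.pi * 1000 * t)
rms, flux, mag = audio_features(frame, np.zeros(n // 2 + 1))
z = np.zeros(512)                 # e.g. a StyleGAN-sized latent vector
z_mod = modulate_latent(z, rms, flux)
```

In a live setting, these features would be computed per audio buffer and the modulated latent passed to the GAN, while MIDI controls adjust parameters such as the gains above.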