“Virtual 3D Sound Sculptures for Realtime Performance: Reappropriation of Game Engines for Visual Music Performance” presented by Blanton
Symposium:
- ISEA2016: 22nd International Symposium on Electronic Art
Session Title:
- Interactivity, Minds and Bodies
Presentation Title:
- Virtual 3D Sound Sculptures for Realtime Performance: Reappropriation of Game Engines for Visual Music Performance
Presenter(s):
- Andrew Blanton
Venue(s):
Abstract:
Built with game development environments and custom software, Microplex comprises three virtual environments that produce sound. Drawing on a conceptual framework of structural commonality between urban environments, computer processors, and the connectivity of the human brain, Microplex exists both as a 15-minute real-time cinema/visual music performance and as an installation of real-time animation.
MICROPLEX is an electroacoustic composition for percussion and real-time visualization. The work is based on transcripts of talks given by Benjamin Bratton[1] and Anil Bawa-Cavia[2] comparing microbiological structures, complex dense networks (such as the human brain and microprocessors), and large-scale human urban growth. The work is in three movements, each with a distinct sonic and visual environment. The first environment is a rendering of the Intel Montecito[3] chip, extruded in three-dimensional space to create an urban landscape. The second movement visualizes the macro-level connectivity of the human brain. The third movement visualizes cellular growth and life cycles. All three movements are built around the idea of virtual 3D sound sculptures that, when visualized, produce sound.
Using four small drums, the performer plays a visual and sonic representation of each data set in real time. Custom software receives the audio signal from each drum as input and excites specific parts of the visuals to create the sonic output. Each of the three movements presents a unique visual and sonic representation.
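As a rough illustration of this drum-to-scene excitation, the sketch below implements per-drum amplitude threshold detection with a short refractory period to debounce repeated triggers. It is a minimal sketch assuming block-based audio input; the threshold value, class names, and the `excite_scene_region` callback are hypothetical stand-ins, not Blanton's actual software.

```python
import numpy as np

THRESHOLD = 0.3        # normalized amplitude above which a hit registers (assumed)
REFRACTORY_BLOCKS = 4  # blocks ignored after a trigger, to debounce (assumed)

def excite_scene_region(drum_id):
    """Stub for the scene-side event; the real software drives the 3D scene."""
    print(f"drum {drum_id}: trigger sent to 3D scene")

class DrumTrigger:
    """Tracks one drum's audio feed and fires on threshold crossings."""
    def __init__(self, drum_id, threshold=THRESHOLD):
        self.drum_id = drum_id
        self.threshold = threshold
        self.cooldown = 0

    def process_block(self, samples):
        """Return True when this audio block registers a new drum hit."""
        if self.cooldown > 0:
            self.cooldown -= 1
            return False
        if np.max(np.abs(samples)) >= self.threshold:
            self.cooldown = REFRACTORY_BLOCKS
            return True
        return False

# One trigger per drum; a hit excites the matching part of the visuals.
triggers = [DrumTrigger(i) for i in range(4)]

def on_audio_blocks(blocks):
    """blocks: four sample arrays, one per drum, for the current frame."""
    for trig, block in zip(triggers, blocks):
        if trig.process_block(block):
            excite_scene_region(trig.drum_id)

# Example: simulate one frame of input where only the third drum is struck.
silent = np.zeros(512)
hit = np.zeros(512)
hit[0] = 0.9  # simulated drum strike
on_audio_blocks([silent, silent, hit, silent])
```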
Built around the visualization of live audio feeds from each drum, the system uses the live audio signal for visualization as well as software-side threshold detection for real-time triggering of events within the 3D scene. Multiple processes then extrapolate audio information from the visualizations. The first renders the scene into a two-dimensional matrix that maps scene luminosity to a bank of sine tone generators. The second tracks three-dimensional points as real-time x, y, and z coordinates and drives synthesizers with that data. The third uses topographical scan-line processing to scan the surface of the objects and derive sound.
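A minimal sketch of the first of these processes, assuming each rendered frame arrives as a grayscale matrix: column-wise luminosity averages set the amplitudes of a fixed bank of sine tone generators. The bank size, base frequency, and harmonic tuning here are illustrative assumptions, not parameters taken from the actual system.

```python
import numpy as np

SAMPLE_RATE = 44100
BANK_SIZE = 16      # number of sine generators in the bank (assumed)
BASE_FREQ = 110.0   # Hz; lowest oscillator, an illustrative choice

def luminosity_bands(frame, bands=BANK_SIZE):
    """Reduce a rendered frame (2D grayscale array, values 0..1) to
    per-band mean luminosity, one amplitude per sine generator."""
    return np.array([c.mean() for c in np.array_split(frame, bands, axis=1)])

def render_sine_bank(amplitudes, duration=0.05, sr=SAMPLE_RATE):
    """Sum a bank of sine tones whose amplitudes follow scene luminosity."""
    t = np.arange(int(sr * duration)) / sr
    out = np.zeros_like(t)
    for i, amp in enumerate(amplitudes):
        out += amp * np.sin(2 * np.pi * BASE_FREQ * (i + 1) * t)  # harmonic tuning (assumed)
    return out / len(amplitudes)  # normalize to avoid clipping

# Example: one synthetic 64x64 "rendered frame" drives the bank for one block.
frame = np.random.rand(64, 64)
audio_block = render_sine_bank(luminosity_bands(frame))
```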
Excerpts of this work have been shown at the 2015 Transplanted Roots: Percussion Symposium in Montreal, Canada[4], the 2015 Understanding Visual Music conference in Brasília, Brazil[5], Gray Area Arts in San Francisco[6], and the McKinney Avenue Contemporary in Dallas, Texas as part of the 2015 Dallas Video Festival[7]. This work has been developed as custom software by Andrew Blanton. The framework allows for rapid prototyping and construction of audiovisual works that both visualize sound and extract sound from visuals. The principal idea of the framework is to create feedback between multiple systems and to insert the performer into the feedback loop to control it in real time. The work builds on previous work in this area by groups such as NOISEFOLD[8], Semiconductor[9], and the Vasulkas[10], among others.