
Title:

Domestic Light: Data Hack Day: a hands-on introduction and discussion of the multispectral light data set and compositional possibilities

Organiser/Presenter(s):

Statement:

The Domestic Light Data Hack Day, led by lead artist Ian Winters and data scientist Weidong Yang, will provide a hands-on introduction to the use and generation of the year+ long multi-spectral light intensity data set generated by the Domestic Light project. As a general framing, we are artists exploring what artistic, performative and compositional possibilities are afforded by a collaboratively generated, durational environmental data set, using a “hack day” format. We hope this data sharing can spark other collaborative uses, extensions and ideas.

The project data is generated by a network of multispectral light sensors housed in collaborators’ homes around the globe, with the goal of placing a sensor in each time zone on earth, each recording and streaming the color of the light of home in real time. See https://domesticlight.art/locations for current locations.

The data set consists of 11 narrow-band spectral readings, spanning near UV to near IR, recorded at a 10-second interval at each location that is part of the project, giving the intensity of light across 11 discrete spectral channels over the year+ duration of the project.
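As a rough orientation to the data's shape, the sketch below reads records from a local CSV export in Python; the file name, column names and field layout here are illustrative assumptions, not the project's published schema.

```python
import csv
from datetime import datetime, timezone

# Hypothetical column layout for one exported row; the real schema may differ.
SPECTRAL_CHANNELS = [
    "f1_415nm", "f2_445nm", "f3_480nm", "f4_515nm", "f5_555nm",
    "f6_590nm", "f7_630nm", "f8_680nm", "clear", "nir", "flicker",
]

def load_records(path):
    """Yield (timestamp, sensor_id, [11 channel counts]) tuples from a CSV export."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromtimestamp(int(row["unix_time"]), tz=timezone.utc)
            counts = [int(row[ch]) for ch in SPECTRAL_CHANNELS]
            yield ts, row["sensor_id"], counts

if __name__ == "__main__":
    for ts, sensor_id, counts in load_records("domestic_light_export.csv"):
        print(ts.isoformat(), sensor_id, counts)
        break  # show only the first record
```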

The sensor boards include additional STEMMA QT I2C ports so that hosts can easily add other environmental sensors, such as a greenhouse gas or VOC sensor, and the sensor code is available under an open source license to support such custom additions.
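As one way a host might confirm that a newly attached STEMMA QT sensor is visible on the bus, the following sketch scans the I2C bus and prints the device addresses it finds; it assumes a CircuitPython build on the board and is independent of the project's own firmware.

```python
import board

# Scan the STEMMA QT / I2C bus and print the address of every attached device,
# e.g. to confirm a newly added greenhouse-gas or VOC sensor is detected.
i2c = board.I2C()  # shared I2C bus exposed on the STEMMA QT connector
while not i2c.try_lock():
    pass
try:
    addresses = i2c.scan()
    print("I2C devices found:", [hex(addr) for addr in addresses])
finally:
    i2c.unlock()
```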

Workshop topics include:

1) An introduction to the sensor platform, which uses an ESP32-S3 and the AS7341 spectral sensor (see the sensor-reading sketch after this list).

2) A guide to accessing the data set using a series of Python-based scripts, both locally and via the project’s API endpoint (see the data-access sketch after this list).

3) An introduction to translating the data into an Open Sound Control (OSC) stream, along with template files, for use in music and video tools such as Max/MSP, Pure Data, SuperCollider, Isadora and TouchDesigner (see the OSC sketch after this list).

4) Demonstration of the multi-spectral LED light developed to replay the data set.

5) A discussion and brainstorm of what tools would make this and other data sets more usable as performative tools (such as translating the data set to an analog control voltage, or adding greenhouse gas sensing capabilities).

6) Hands-on work time with the data set to sketch out ideas.

7) A final show and tell of outcomes.
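For the sensor-platform topic, a minimal sketch of reading the AS7341's spectral channels in CircuitPython using the Adafruit AS7341 driver; the project's own ESP32-S3 firmware may read and transmit the sensor differently, so this is for orientation only.

```python
import time
import board
from adafruit_as7341 import AS7341  # Adafruit CircuitPython driver for the AS7341

i2c = board.I2C()
sensor = AS7341(i2c)

while True:
    # Narrow-band channel counts from near UV (415 nm) through red (680 nm),
    # plus the clear and near-IR channels.
    reading = {
        "415nm": sensor.channel_415nm,
        "445nm": sensor.channel_445nm,
        "480nm": sensor.channel_480nm,
        "515nm": sensor.channel_515nm,
        "555nm": sensor.channel_555nm,
        "590nm": sensor.channel_590nm,
        "630nm": sensor.channel_630nm,
        "680nm": sensor.channel_680nm,
        "clear": sensor.channel_clear,
        "nir": sensor.channel_nir,
    }
    print(reading)
    time.sleep(10)  # mirror the project's 10-second recording interval
```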
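For the data-access topic, a sketch of pulling recent readings over HTTP with the requests library; the endpoint URL, query parameters and JSON structure here are hypothetical placeholders, not the project's documented API.

```python
import requests

# Hypothetical endpoint and parameters -- substitute the project's documented API.
API_URL = "https://example.org/domesticlight/api/readings"

def fetch_recent(sensor_id, limit=100):
    """Fetch the most recent readings for one sensor as a list of dicts."""
    resp = requests.get(API_URL, params={"sensor": sensor_id, "limit": limit}, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for reading in fetch_recent("hypothetical-sensor-id", limit=5):
        print(reading)
```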
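For the OSC topic, a sketch of repackaging readings as OSC messages with the python-osc package, which tools such as Max/MSP, Pure Data, SuperCollider, Isadora or TouchDesigner can receive over UDP; the OSC address pattern, port and channel ordering are illustrative assumptions.

```python
import time
import random
from pythonosc.udp_client import SimpleUDPClient

# Send to a local OSC receiver, e.g. a [udpreceive 9000] object in Pure Data
# or a udpreceive -> oscparse chain in Max/MSP. Port and address are placeholders.
client = SimpleUDPClient("127.0.0.1", 9000)

def send_reading(sensor_id, channels):
    """Send the 11 spectral channel values as a single OSC message."""
    client.send_message(f"/domesticlight/{sensor_id}", channels)

if __name__ == "__main__":
    # Stand-in data: replace with values loaded from the data set or the API.
    while True:
        fake_channels = [random.randint(0, 65535) for _ in range(11)]
        send_reading("demo-sensor", fake_channels)
        time.sleep(10)  # mirror the data set's 10-second cadence
```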

Category: