In the course of my master's thesis, my colleague Naida and I created a prototype for "technology-mediated audience participation". So what does that mean?
The aim was to create a piece of technology that allows the audience to take action and influence the structure of a concert.
In our setup the movement of the audience was tracked by a camera. In areas where a lot of activity occurred (dancing, waving hands, …), more parts of the initially dark projection were revealed. In our case, however, the projection was not behind the musicians, as with most visuals, but between the musicians and the audience.
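The camera tracking could be sketched roughly as frame differencing accumulated per grid cell: wherever consecutive frames differ a lot, people are moving. This is a simplified, self-contained sketch of that idea (the actual prototype used OpenCV for Processing); the grid size and threshold below are illustrative values, not the ones we used.

```java
// Simplified sketch of the activity tracking: compare two grayscale
// frames, accumulate per-cell motion energy, and report which grid
// cells count as "active" audience areas.
public class ActivityGrid {
    final int cols, rows;
    final double[] energy;          // accumulated motion per cell

    ActivityGrid(int cols, int rows) {
        this.cols = cols;
        this.rows = rows;
        this.energy = new double[cols * rows];
    }

    // frames are [height][width] grayscale values 0-255
    void accumulate(int[][] prev, int[][] curr) {
        int h = curr.length, w = curr[0].length;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int diff = Math.abs(curr[y][x] - prev[y][x]);
                int cx = x * cols / w, cy = y * rows / h;
                energy[cy * cols + cx] += diff;
            }
        }
    }

    boolean isActive(int cx, int cy, double threshold) {
        return energy[cy * cols + cx] >= threshold;
    }

    public static void main(String[] args) {
        ActivityGrid grid = new ActivityGrid(2, 2);
        int[][] prev = new int[4][4];     // all black
        int[][] curr = new int[4][4];
        curr[0][0] = 255;                 // movement in the top-left quadrant
        grid.accumulate(prev, curr);
        System.out.println(grid.isActive(0, 0, 100)); // prints true
        System.out.println(grid.isActive(1, 1, 100)); // prints false
    }
}
```

In the real sketch, each active cell would then reveal the corresponding part of the projection.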
Each element that the active audience added to the projection also added to or changed the backing track (beats, percussion, basslines). The musicians, a singer and a violinist, improvised along with the changes of this dynamic set.
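One way to wire a revealed element to the backing track is to map each element to a MIDI note that triggers a clip in Ableton Live, using Java's standard `javax.sound.midi` API. This is only a sketch of that mapping; the note, channel, and velocity values are made up for illustration.

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.ShortMessage;

// Sketch: each revealed element toggles one backing-track layer
// (beats, percussion, bassline, ...) via a MIDI NOTE_ON message.
public class LayerTrigger {
    // Build the NOTE_ON message for a given layer index.
    static ShortMessage noteOnFor(int layer) throws InvalidMidiDataException {
        int channel = 0;
        int note = 36 + layer;   // e.g. C1 upwards, one note per clip slot
        int velocity = 100;
        return new ShortMessage(ShortMessage.NOTE_ON, channel, note, velocity);
    }

    public static void main(String[] args) throws Exception {
        ShortMessage msg = noteOnFor(2);
        // In a live setup this message would be sent to a Receiver
        // connected to Ableton Live, e.g. via MidiSystem.
        System.out.println(msg.getData1()); // prints 38
    }
}
```

Ableton Live can map such incoming notes to clip launches, so each unlocked element simply "plays its note" once.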
After a certain number of elements had been revealed in one area of the projection, that space was "unlocked". Each unlocked area of the canvas fell down to the floor, so the more the audience interacted, the more of the musicians they could see.
When the last part of the canvas fell to the ground, the performance came to an end.
Prototype Codebase: Processing / Java
Tracking: OpenCV for Processing
Sound: MIDI messages sent to Ableton Live
Hardware: Servo motors controlled by an Arduino