Earlier this year at the SXSW Interactive festival, the design and development agency Deloitte Digital presented an immersive generative music installation called the “Audience Reactive Composition” (ARC). Created by the Dave & Gabe Interactive Installation Studio, the project was described in the online magazine The Verge as something “you would see at a Daft Punk concert fifteen years from now, with Tron-like neon lights and all manner of rotating spheres and illuminating touch-sensitive cylinders”. Users play the ARC’s music through physical interaction with its five sculpted instrument interfaces, which are equipped with touch sensors, large glowing trackballs and hand-sized flat joysticks. By manipulating these instruments, players change the rhythm, melody, chords and sonic intensity of various elements in an algorithmically driven piece of music created by the electronic musician known as RAC (André Allen Anjos). The result is played back over a system of 20 speakers that encircle the central instrument station, and synchronized light animations appear in the surrounding environment in response to the resulting music.
The installation was created with the intention of inviting people who might normally be intimidated by a traditional musical instrument to take part in a music creation process and collaborate in real time with the other players at the table, forming a sort of impromptu band of remix DJs. The promotional video for the project states that the resulting music “isn’t a static product, but a living organism.” I haven’t been able to find any technical design details, but the project illustrates an intriguing direction for generative performance systems in a live group setting.