The “game” is described as being about “news cycles, vicious cycles, infinite cycles.” To me, it’s an inspiring example of intentional…I might even say “activist”…web-based art, especially given the outcome of the recent national election and the media’s role in feeding fears and influencing voters’ choices. I initially felt that Case was being too extreme in the portrayal of media sensationalism, but his blog post about this project sheds a bit more light on where he’s coming from. He describes how his fellowship with the PBS program Frontline provided him with a deeper perspective on the lure of “clickbait” and how journalists struggle with it – the sensationalist stories end up getting the most attention from the general public. Even if you don’t agree with his viewpoint about “the media”, the experience of the game provides an interesting catalyst for conversations about the effect of these cycles on our society and culture. It’s also great that Case has made the code for this (and his other projects) openly available for other developers to play with and freely remix.
Earlier this year at the SXSW Interactive festival, the design and development agency Deloitte Digital presented an immersive generative music installation called the “Audience Reactive Composition” (ARC). Created by the Dave & Gabe Interactive Installation Studio, the project was described in the online magazine The Verge as something “you would see at a Daft Punk concert fifteen years from now, with Tron-like neon lights and all manner of rotating spheres and illuminating touch-sensitive cylinders”. Users can play the ARC’s music through physical interaction with its five sculpted instrument interfaces equipped with touch sensors, large glowing trackballs and hand-sized flat joysticks. By manipulating these instruments, players change the rhythm, melody, chords and sonic intensity of various elements in an algorithmically generated piece of music created by the electronic musician known as RAC (André Allen Anjos). The result is played back over a system of 20 speakers that encircle the central instrument station. Synchronized light animations also appear in the surrounding environment in response to the resulting music.
The installation was created with the intention of inviting people who might normally be intimidated by playing a traditional musical instrument to become part of a music creation process and collaborate in real time with other players at the table, forming a sort of impromptu band of remix DJs. The promotional video for the project states that the resulting music “isn’t a static product, but a living organism.” I haven’t been able to find any technical design details, but the project illustrates an intriguing direction for generative performance systems in a live group setting.
Inspiration of the week discovered while conducting research for my final paper in the Digital Cultures course – the iOS game “Quarta” created by Peter Chilvers and Brett Gilbert.
Some intriguing aspects of this game: it’s easy to learn quickly, features a simple, intuitive interface with a clean visual design, and it’s quite addictive and replayable – you don’t get bored with it after a few matches. It also provides a soothing, ambient soundtrack that is generated based on the number and spatial arrangement of the black and white pieces on the board (and possibly the color of the various areas as well, though I haven’t yet noticed a correlation there), making it an interesting hybrid of a game and a musical instrument. Chilvers has a background in game development and generative music systems, having worked on projects such as the original Creatures release and the Will Wright game Spore, for which Chilvers collaborated with musician Brian Eno to create an ever-changing generative soundtrack. Chilvers and Eno have also created the generative music system iOS apps Bloom and Scape – more information about all of their app collaborations is available at www.generativemusic.com. In my opinion, Quarta is a reminder of how engaging a simple and well-executed game concept can be.
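Quarta’s actual sound mapping hasn’t been published, but the basic idea – deriving ambient music parameters from a board state – is easy to sketch. Here is a purely hypothetical Python illustration (the function names, the pentatonic scale choice, and the position-to-note rule are all my own assumptions, not Chilvers’ design): each piece on the board selects a note from a pentatonic scale based on its position, and the total piece count sets the texture’s density.

```python
# Hypothetical sketch: map a board state to ambient-soundtrack parameters,
# loosely in the spirit of Quarta's board-driven music (the real mapping
# is unpublished; this is illustration only).

PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees in semitones above the root


def board_to_notes(black_positions, white_positions, root=60):
    """Choose MIDI notes from piece positions: each piece contributes a
    pentatonic note picked by its (row + col) location on the board."""
    notes = []
    for (row, col) in black_positions + white_positions:
        degree = PENTATONIC[(row + col) % len(PENTATONIC)]
        octave = 12 * ((row + col) // len(PENTATONIC) % 2)  # spread over two octaves
        notes.append(root + degree + octave)
    return sorted(set(notes))


def board_to_density(black_positions, white_positions, board_size=4):
    """More pieces on the board -> a denser texture, scaled 0.0 to 1.0."""
    total = len(black_positions) + len(white_positions)
    return min(1.0, total / float(board_size * board_size))
```

For example, three pieces at (0,0), (1,1) and (2,3) would yield the chord [60, 64, 72] and a sparse density of about 0.19 – enough to drive a synthesizer pad in an actual implementation.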
As spotted recently on the “prosthetic knowledge” tumblr site – the Holographic Whisper three-dimensional spatial audio speaker system. (The slightly-over-the-top-futuristic-tech-style promotional video is included below…)
The creators propose a “sound-point method” that enables control of “aerial audio distributions more flexibly and more precisely in comparison with conventional superdirectional (sound-beam) loudspeakers. This method can generate and vanish the sound sources freely in air. These point sound sources can deliver private messages or music to individuals.” Unfortunately, there is no clear link to the mentioned research paper, and it doesn’t look like a prototype has been developed at this point. But it certainly warrants further exploration – I’ve been intrigued for a while with the idea of creating a sonic installation in a space that could record the voices of attendees, and then play back segments of those recordings to future attendees with the audio being targeted (to be heard) at the same spatial location that the voices were recorded…a sonically “haunted” room filled with the voices of ghosts from past visitors.
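The bookkeeping behind that “haunted room” idea is simple to sketch, even without the speaker hardware. Here is a minimal hypothetical Python outline (the class and its behavior are my own invention, not part of the Holographic Whisper research): recordings are stored tagged with the position where they were captured, and a later visitor’s position retrieves the clips recorded nearby, which a focused-audio array could then aim back at that spot.

```python
import math

# Hypothetical sketch of the "haunted room" idea: store visitor recordings
# tagged with where they were captured, then retrieve the recordings made
# near a later visitor's position for spatially targeted playback.
# (A real system would drive a focused-audio speaker array; here a
# "recording" is just an identifier.)


class HauntedRoom:
    def __init__(self, playback_radius=1.5):
        self.playback_radius = playback_radius  # metres
        self.recordings = []  # list of (x, y, clip_id)

    def record(self, x, y, clip_id):
        """Log a clip captured at floor position (x, y)."""
        self.recordings.append((x, y, clip_id))

    def clips_near(self, x, y):
        """Return clip ids recorded within playback_radius of (x, y),
        nearest first -- the "ghosts" to focus back at that spot."""
        measured = [(math.hypot(cx - x, cy - y), clip)
                    for (cx, cy, clip) in self.recordings]
        return [clip for d, clip in sorted(measured)
                if d <= self.playback_radius]
```

The interesting design questions all live outside this sketch: how long ghosts persist, how many overlapping voices a spot can hold, and whether playback fades with the age of the recording.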
Along with all of the great examples available on the (newly re-vamped?) OpenProcessing site, I stumbled upon this inspiring collection on Tumblr – “Experiments in Processing” http://p5art.tumblr.com/. I’m not sure who’s responsible for the site, and there are many sketches that are quite similar to those posted on OpenProcessing. But for me, this is a particularly attractive group of intriguing experiments to explore and learn techniques from. In particular, the fireflies example turned me on to Daniel Shiffman’s “Metaballs” coding challenge tutorial on YouTube – high on my “must watch soon” list.
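For anyone else curious before watching, the core of the metaballs technique is just a scalar field: each ball contributes an influence that falls off with distance, and pixels where the summed influence crosses a threshold are drawn as “inside” the blob, which is what makes nearby balls appear to merge. A minimal sketch of that field calculation in Python (one common formulation, using radius divided by distance; the exact falloff varies between implementations):

```python
import math

# Minimal metaballs field calculation: each ball (x, y, radius) contributes
# r / d to a scalar field, and points where the summed field crosses a
# threshold count as "inside" the merged blob.


def field_value(x, y, balls):
    """Sum each ball's influence r / d at point (x, y)."""
    total = 0.0
    for (bx, by, r) in balls:
        d = math.hypot(bx - x, by - y)
        total += r / d if d > 0 else float("inf")
    return total


def inside_blob(x, y, balls, threshold=1.0):
    """A point is inside the merged blob once the field meets the threshold."""
    return field_value(x, y, balls) >= threshold
```

With two balls of radius 10 centred at (0, 0) and (30, 0), the midpoint (15, 0) is inside the blob even though it lies outside both circles – each ball contributes 10/15, summing past the threshold – which is exactly the merging effect that makes metaballs look organic.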