Noises & Signals

Contemplations on creativity in our digital age


“We Become What We Behold”

I’ve been a fan of Nicky Case’s work for a while, especially his Parable of the Polygons and Neurotic Neurons projects. His most recent work, “We Become What We Behold”, is the latest addition to his thoughtful and entertaining online collection.

The “game” is described as being about “news cycles, vicious cycles, infinite cycles.” To me, it’s an inspiring example of intentional…I might even say “activist”…web-based art, especially given the outcome of the recent national election and the media’s role in feeding fears and influencing voters’ choices. I initially felt that Case was being too extreme in his portrayal of media sensationalism, but his blog post about this project sheds a bit more light on where he’s coming from. He describes how his fellowship with the PBS program Frontline provided him with a deeper perspective on the lure of “clickbait” and how journalists struggle with it – the sensationalist stories end up getting the most attention from the general public. Even if you don’t agree with his viewpoint about “the media”, the experience of the game provides an interesting catalyst for conversations about the effect of these cycles on our society and culture. It’s also great that Case has made the code for this (and his other projects) openly available for other developers to play with and freely remix.

The “Audience Reactive Composition”

Earlier this year at the SXSW Interactive festival, the design and development agency Deloitte Digital presented an immersive generative music installation called the “Audience Reactive Composition” (ARC). Created by the Dave & Gabe Interactive Installation Studio, the project was described in the online magazine The Verge as something “you would see at a Daft Punk concert fifteen years from now, with Tron-like neon lights and all manner of rotating spheres and illuminating touch-sensitive cylinders”. Users can play the ARC’s music through physical interaction with its five sculpted instrument interfaces, which are equipped with touch sensors, large glowing trackballs and hand-sized flat joysticks. By manipulating these instruments, players change the rhythm, melody, chords and sonic intensity of various elements in an algorithmic piece of music created by the electronic musician known as RAC (André Allen Anjos). The result is played back over a system of 20 speakers that encircle the central instrument station, and synchronized light animations appear in the surrounding environment in response to the resulting music.

The installation was created with the intention of inviting people who might normally be intimidated by playing a traditional musical instrument to become part of a music creation process and collaborate in real time with other players at the table, forming a sort of impromptu band of remix DJs. The promotional video for the project states that the resulting music “isn’t a static product, but a living organism.” I haven’t been able to find any technical design details, but the project illustrates an intriguing direction for generative performance systems in a live group setting.

Quarta – an iOS game with a generative music soundtrack


Screenshot of the Quarta game

Inspiration of the week, discovered while conducting research for my final paper in the Digital Cultures course: the iOS game “Quarta”, created by Peter Chilvers and Brett Gilbert.

Some intriguing aspects of this game: it’s easy to learn quickly, features a simple, intuitive interface with a clean visual design, and it’s quite addictive and replayable – you don’t get bored with it after a few matches. It also provides a soothing, ambient soundtrack that is generated based on the number and spatial arrangement of the black and white pieces on the board (and possibly the color of the various areas as well, though I haven’t yet noticed a correlation there), making it an interesting hybrid of a game and a musical instrument. Chilvers has a background in game development and generative music systems, having worked on projects such as the original Creatures release and the Will Wright game Spore, for which he collaborated with musician Brian Eno to create an ever-changing generative soundtrack. Chilvers and Eno have also created the generative music iOS apps Bloom and Scape – more information about all of their app collaborations is available at . In my opinion, Quarta is a reminder of how engaging a simple and well-executed game concept can be.

Arduino/MicroView sound file controller & looper

A recent challenge in the EDPX 4010 course was to connect an Arduino device via a serial port to control a p5.js sketch. In this case, we’re working with the Arduino-compatible MicroView module included in this SparkFun Inventor’s Kit. I wanted to explore the p5 sound library further, so I made a simple device that controls the playback speed of an audio file (between 0 and 3×) with a potentiometer, and can also loop a chosen section of the audio file using pushbutton controls.

Pressing the black button sets the start point of the looped segment, and the red button sets the end point and begins the looped playback of that segment. Pressing the red button again will set another end point in the loop and shorten the looped segment even more, and the black button will stop the looping and continue the playback normally. The MicroView screen displays the playback speed of the audio and the status of the black and red buttons. The p5 screen (above right) displays the current playback rate, whether looping is on or off (true or false), the status of the pushbuttons, the start (cued) time of the loop, and the current time of the audio file’s playback. The size of the yellow circle changes based on the playback rate. The p5 source code for the project is available here, and the MicroView/Arduino source code is here.
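The button behavior described above can be sketched as a small state machine. This is a minimal, simplified model in plain JavaScript – the class and method names are my own, not the project’s actual source:

```javascript
// Sketch of the looper's button logic (hypothetical names).
// Black button cues a loop start point; red button sets (or
// tightens) the end point; black again cancels looping.
class Looper {
  constructor() {
    this.loopStart = null; // seconds
    this.loopEnd = null;   // seconds
    this.looping = false;
  }
  pressBlack(currentTime) {
    if (this.looping) {
      // second black press: stop looping, resume normal playback
      this.looping = false;
      this.loopStart = null;
      this.loopEnd = null;
    } else {
      this.loopStart = currentTime; // cue the start point
    }
  }
  pressRed(currentTime) {
    if (this.loopStart === null) return; // no start point cued yet
    // first red press begins looping; later presses shorten the loop
    if (currentTime > this.loopStart) {
      this.loopEnd = currentTime;
      this.looping = true;
    }
  }
}
```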

For the serial port connection, I used the p5.serialport library, and also the p5.serialcontrol GUI application to perform the actual serial communication, since JavaScript in a browser cannot interact directly with a serial port. To run this sketch, you must first open the serialcontrol application and then run the p5 sketch. Basically, the MicroView sends three values as a comma-separated text string through the serial port: the “digitalRead” state of the two buttons (0 or 1), and the “analogRead” value of the potentiometer, mapped to 0-255. The p5 sketch receives this text and parses the values with the split command, separating by commas. The sketch then uses those values to affect the playback speed and looping parameters. It also contains some logic checks to prevent repeated triggering if a button is held down, so that a held push is registered as only one push and does not continually change values (this technique is known as “state change” or “edge” detection).
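The receiving side can be roughly sketched in plain JavaScript like this – the helper names are hypothetical; the actual sketch uses p5.serialport’s callbacks and p5’s split() function:

```javascript
// Parse the MicroView's comma-separated serial line into values:
// two button states (0/1) and the potentiometer reading (0-255).
function parseSerialLine(line) {
  const [black, red, pot] = line.trim().split(',').map(Number);
  return { black, red, pot };
}

// Edge detection: report true only on the 0 -> 1 transition,
// so a held button registers as a single press.
function makeEdgeDetector() {
  let prev = 0;
  return function rose(current) {
    const fired = current === 1 && prev === 0;
    prev = current;
    return fired;
  };
}
```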

Some glitches with the p5.sound library: before the playback of a loop begins, the library first stops the playing state, sets the loop cue times, and then restarts playing, which creates a short audible pause in the process. Also, I initially had the potentiometer control the direction as well as the speed, so that the audio could be played in reverse. However, the library seems to reset the playback point to the beginning of the file before it begins the reverse playback, so the forwards/backwards control does not sound seamless, always restarting from the same point in the file. I’m interested in digging further into the code of the library itself to see if I can change that behavior.

400 robot heads

Assignment for the 4010 course: create a grid of robot heads, 20×20, with four variations shifting between rows or columns. The center four should “make a robot sound when clicked”. If you click on the center four figures in this sketch, you’ll hear a random quote spoken in synthesized speech, via the p5.speech library.

The single eye of each head also follows the mouse location, utilizing p5’s “constrain” function. Source code available here. The quotes were selected from this collection.
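The eye-tracking idea works by clamping the pupil’s offset from the eye center. Here is a minimal sketch of it; constrain() is reimplemented so the snippet runs outside of p5, and the coordinates and offset limit are illustrative, not the sketch’s actual values:

```javascript
// Equivalent of p5's constrain(): clamp n to the range [low, high].
function constrain(n, low, high) {
  return Math.max(low, Math.min(n, high));
}

// Place the pupil toward the mouse, but never more than maxOffset
// pixels away from the eye's center.
function pupilPosition(eyeX, eyeY, mouseX, mouseY, maxOffset) {
  return {
    x: eyeX + constrain(mouseX - eyeX, -maxOffset, maxOffset),
    y: eyeY + constrain(mouseY - eyeY, -maxOffset, maxOffset),
  };
}
```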

3D sound possibilities

As spotted recently on the “prosthetic knowledge” Tumblr site – the Holographic Whisper three-dimensional spatial audio speaker system. (The slightly-over-the-top-futuristic-tech-style promotional video is included below…)

The creators propose a “sound-point method” that enables control of “aerial audio distributions more flexibly and more precisely in comparison with conventional superdirectional (sound-beam) loudspeakers. This method can generate and vanish the sound sources freely in air. These point sound sources can deliver private messages or music to individuals.” Unfortunately, there is no clear link to the mentioned research paper, and it doesn’t look like a prototype has been developed at this point. But it certainly warrants further exploration – I’ve been intrigued for a while with the idea of creating a sonic installation in a space that could record the voices of attendees, and then play back segments of those recordings to future attendees, with the audio targeted (to be heard) at the same spatial location where the voices were recorded…a sonically “haunted” room filled with the voices of ghosts from past visitors.

Perlin noise “sound sphere” sketch

A sketch playing with the noise function of p5 and the sound library. The vertical positioning of each sphere, as well as the frequency of its oscillator, is shifted by stepping through the Y value of a noise sequence. The Z value of the noise sequence affects the diameter of the sphere and the amplitude of its oscillator.

Add the initial “soundsphere” by pressing the right arrow. Use the left & right arrow keys to delete or add more spheres. (You can add up to 20.)
Use the up and down arrows to adjust the speed of the step rate through the noise value.
Use the ‘a’ and ‘s’ keys to move the selector left or right (selector is displayed at the bottom).
Use the spacebar to change the wave type of the selected sphere (wave type will be displayed at the top right).
You can change the frequency range of the oscillator wave for the selected sphere using the following keys:
1 = Decrease lower limit of range by 50 Hz
2 = Increase lower limit of range by 50 Hz
3 = Decrease lower limit of range by 200 Hz
4 = Increase lower limit of range by 200 Hz
7 = Decrease upper limit of range by 200 Hz
8 = Increase upper limit of range by 200 Hz
9 = Decrease upper limit of range by 50 Hz
0 = Increase upper limit of range by 50 Hz
The lowest frequency limit is 60 Hz and the highest is 15 kHz.

The sketch will probably run more smoothly if you view it on its own page here. Source code is available here and is commented in detail. One issue I’m noting for future investigation: some audio “clicking” distortion occurs in certain frequency ranges.
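As a rough sketch of the noise-to-frequency mapping described above – map() is reimplemented to run outside p5, and the helper name and range arguments are my own, standing in for the per-sphere range adjusted with the number keys:

```javascript
// Equivalent of p5's map(): linearly rescale value from one range
// to another.
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + (outMax - outMin) * ((value - inMin) / (inMax - inMin));
}

// Map a 0-1 noise value into a sphere's oscillator frequency,
// with the sphere's range clamped to the 60 Hz - 15 kHz bounds.
function noiseToFrequency(n, freqLow, freqHigh) {
  const low = Math.max(60, freqLow);      // global lower bound
  const high = Math.min(15000, freqHigh); // global upper bound
  return map(n, 0, 1, low, high);
}
```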

Further exploration of arrays, objects and “scribbles” in p5

My latest experimental sketch with p5.js involves work with arrays. You will probably need to click on the image to activate it. Use the “t”, “s” and “c” keys on your keyboard to add a triangle, square or circle to the scene (size and color are randomized).

The sketch can be viewed on a separate page here (which will probably perform at a faster frame rate), and the p5 source code is available here. Each shape is added as an independent object to the master shape array. The number of elements currently in the master array is displayed at the top left. As the shapes fall and are shuttled off to the left or right (again, a random choice), they are “spliced” from the array after they leave the screen, hence why the count goes down.
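The splice-on-exit approach can be sketched like this, with simplified shape objects rather than the sketch’s actual code:

```javascript
// Remove shapes from the master array once they leave the canvas.
// Iterating backwards means splice() removals don't skip elements.
function cullOffscreen(shapes, width) {
  for (let i = shapes.length - 1; i >= 0; i--) {
    if (shapes[i].x < -shapes[i].size || shapes[i].x > width + shapes[i].size) {
      shapes.splice(i, 1); // drop it; the displayed count shrinks
    }
  }
  return shapes;
}
```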
This sketch also makes use of the p5.scribble library, which gives the shapes their jagged, “sketchy” appearance. If you uncomment the “randomSeed” statement in line 15 of the code, this will stop the animation of the jaggedness, since the randomization used for that effect (in the p5.scribble code) is then “seeded” continually with the same number. (This number could be anything…not just “98”.)

At what risk?

A slight diversion of focus for this post…this past Friday (Oct. 21st), there were large distributed denial-of-service attacks (targeted at servers maintained by the company Dyn) which affected many major sites, including Netflix and Twitter. It appears that thousands of the DDoS sources included “internet of things” devices like webcams…and some of those are now being recalled:
Webcams used to attack Reddit and Twitter recalled
The Chinese electronics manufacturer Hangzhou Xiongmai stated that many of their cameras could be easily hacked since users didn’t bother to change the default password on their devices. A bigger issue is that some devices don’t even allow users to change a default password. The BBC article states “Security costs money and electronics firms want to make their IoT device as cheap as possible. Paying developers to write secure code might mean a gadget is late to market and is more expensive. Plus enforcing good security on these devices can make them harder to use – again that might hit sales.”

…So, they get a chance of a slightly greater profit margin at the risk of a massive cyberattack that knocks out hugely popular websites used daily by millions of people? And they risk the enormous expense of needing to recall and upgrade their devices after such an attack occurs? Hmm…lesson learned??

Touchscreen pong with p5

…so it’s not exactly an “extreme” or “ultimate” game of pong, but the challenge led me to try some previously unexplored functions in p5.js. This version currently works only with touch screens and is sized specifically for an iPad Air. (p5 source code is available here.)
p5 pong game screenshot

Particularly interesting elements of developing the game for me were:

  • Coding the scoring system. Points are gained by each player when the small light blue marble “ball” hits one of the 3 larger marbles in the center (based on the direction of the ball). The 2 outer targets are worth 10 points, the middle is worth 20. The numbers display the accumulated score for each player; however, if a player misses the ball with their paddle, they lose all of the points that they scored during that round (their score resets to that of the previous round). The target hit detection makes use of p5’s dist() function.
  • Using the oscillators and envelopes in the p5.sound library. This example was particularly helpful in getting started with sound generation, though I bypassed the MIDI note conversion and am providing the oscillator with a specific frequency value. Also, to prevent an audible note from immediately playing at the start of the game, the oscillator generates a super-low frequency of 1 Hz until a target is hit and a different tone is played…a bit of a hack until I come up with a better solution.
  • Use of the built-in touches array to detect fingers on a touch screen. I found the short example sketch listed in the first response from lmccart on this page to be quite useful in figuring out how to track (and limit the number of) touches on a screen. The paddles and ball won’t move unless at least one finger is on either side.
  • Changes in velocity depending on which part of the paddle hits the ball. Basically using a combination of the map() function and some hit-or-miss experimentation with calculations for this – the ball will move slower and more vertically straight when it hits the center of a paddle, and the angle and speed increases greatly towards the edges.
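A rough sketch of the hit detection and velocity mapping described above, with dist() and map() reimplemented so the snippet runs outside p5 (the numbers here are illustrative, not the game’s actual tuning):

```javascript
// Equivalent of p5's dist(): Euclidean distance between two points.
function dist(x1, y1, x2, y2) {
  return Math.hypot(x2 - x1, y2 - y1);
}

// Circle-to-circle hit test: the ball hits a target marble when the
// center-to-center distance is less than the sum of the radii.
function hitsTarget(ballX, ballY, targetX, targetY, ballR, targetR) {
  return dist(ballX, ballY, targetX, targetY) < ballR + targetR;
}

// Equivalent of p5's map(): linearly rescale value between ranges.
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + (outMax - outMin) * ((value - inMin) / (inMax - inMin));
}

// offset: -1 (one paddle edge) .. 0 (center) .. 1 (other edge).
// Center hits are slow; edge hits get the fastest rebound.
function bounceSpeed(offset, minSpeed, maxSpeed) {
  return map(Math.abs(offset), 0, 1, minSpeed, maxSpeed);
}
```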

© 2019 Noises & Signals
