Noises & Signals

Contemplations on creativity in our digital age

p5 EDP logo project

This sketch was created as a final project for the EDPX 4010 Emergent Digital Tools course. The assignment was to create a visual generative piece using p5.js that somehow incorporates the EDP program logo. The project will then be displayed on a large LCD screen in the program’s office as part of a collection of digital art works.

After experimenting (unsuccessfully) with other ideas based on inspiring sketches found on OpenProcessing, I took an approach for this project that would run over long periods of time at a reliable frame rate, avoiding the speed limitations I’ve encountered using p5 and JavaScript for more complex sketches. Each letter of “edp” is controlled by noise-influenced parameters affecting its position, scale, hue and brightness. The descriptive text in the background is “painted over” by the larger letters, and fades and shifts position after about a minute. To achieve a clean trailing effect, I avoided a transparent background, which causes a visual artifact of “ghost trail” images. Instead, I used a suggested technique: initially capture a close-to-black image of the empty canvas, then blend that image into each new frame using the DIFFERENCE (subtractive) blending mode.
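Here’s a minimal, self-contained sketch of that trail-fading idea – illustrative only, not the project’s actual code (the canvas size, gray value and moving ellipse are stand-ins):

// Minimal p5.js sketch of the DIFFERENCE-blend trail technique.
let fader; // snapshot of the almost-black empty canvas

function setup() {
  createCanvas(400, 400);
  background(2);  // close to black, but not pure black
  fader = get();  // capture the empty canvas as an image
  background(0);
}

function draw() {
  // DIFFERENCE-blending the snapshot pulls every pixel a couple of
  // steps toward black each frame, so trails fade cleanly instead of
  // leaving the "ghost trail" residue of a translucent background().
  blendMode(DIFFERENCE);
  image(fader, 0, 0);
  blendMode(BLEND);

  noStroke();
  fill(255, 200, 0);
  ellipse(width / 2 + 150 * cos(frameCount * 0.05),
          height / 2 + 150 * sin(frameCount * 0.03), 24, 24);
}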

The project can be viewed in its full 1920×1080 dimensions here, and the p5.js source code can be viewed here. The target display that this will be running on is rotated 90 degrees clockwise, which is why the sketch appears sideways here.

The Unbearable Slowness of p5-ing

…well, “unbearable” might be a stretch. But when compared to running native Processing sketches in Java, the difference is certainly noticeable. While experimenting with potential approaches for the final generative art project in EDPX 4010, I was very inspired by Asher Salomon’s “Evolution” sketch on OpenProcessing: https://www.openprocessing.org/sketch/15839, which creates a very cool watercolor-esque painting effect. My initial attempts at porting Salomon’s code to p5/JavaScript resulted in an abysmally slow rendering rate…nothing even close to the original sketch. After a lot of further exploration and changes, I finally reached a more respectable speed, though still not fast enough for a large 1920×1080 canvas. My p5 results can be viewed here, and the source code is here. Some important discoveries:

1 – Instead of the get() and set() methods used in the original (which are still available in p5/JS), getting and setting pixel colors via the pixels[] array is faster. loadPixels() and updatePixels() need to be called for this approach to work. Dan Shiffman’s “Pixel Array” tutorial video was incredibly helpful in understanding this process – in particular, calling pixelDensity(1) for my Mac’s retina display was a trick that I did not find documented elsewhere.
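A minimal sketch of that pixels[] pattern (the gradient fill is just an illustration, assuming the default RGBA pixel layout):

// Direct pixel manipulation in p5.js, as described above.
function setup() {
  createCanvas(200, 200);
  pixelDensity(1); // key step on retina displays: keeps pixels[] at canvas size
}

function draw() {
  loadPixels(); // populate the pixels[] array from the canvas
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = 4 * (y * width + x); // 4 bytes per pixel: R, G, B, A
      pixels[i] = x;       // red
      pixels[i + 1] = y;   // green
      pixels[i + 2] = 128; // blue
      pixels[i + 3] = 255; // alpha
    }
  }
  updatePixels(); // push the modified array back to the canvas
}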

2 – The native JavaScript Math.floor, Math.min and Math.max methods performed faster than p5’s floor() and constrain() helpers by a substantial margin!
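For example, inside a p5 sketch these two lines produce the same result (rawValue is just a placeholder – the speed difference is worth benchmarking in your own sketch):

const rawValue = 312.7;
let v = constrain(floor(rawValue), 0, 255);            // p5 helpers
// the native JavaScript version ran substantially faster for me:
let v2 = Math.min(Math.max(Math.floor(rawValue), 0), 255);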

There are probably more optimization tweaks to be made to the p5 version – to be continued…

 

“We Become What We Behold”

I’ve been a fan of Nicky Case’s work for a while, especially his Parable of the Polygons and Neurotic Neurons projects. His most recent work, “We Become What We Behold”, is the latest addition to his thoughtful and entertaining online collection.

The “game” is described as being about “news cycles, vicious cycles, infinite cycles.” To me, it’s an inspiring example of intentional…I might even say “activist”…web-based art, especially given the outcome of the recent national election and thinking about the media’s role in feeding fears and influencing voters’ choices. I initially felt that Case was being too extreme in his portrayal of media sensationalism, but his blog post about this project sheds a bit more light on where he’s coming from. He describes how his fellowship with the PBS program Frontline gave him a deeper perspective on the lure of “clickbait” and how journalists struggle with it – the sensationalist stories end up getting the most attention from the general public. Even if you don’t agree with his viewpoint about “the media”, the experience of the game provides an interesting catalyst for conversations about the effect of these cycles on our society and culture. It’s also great that Case has made the code for this (and his other projects) openly available for other developers to play with and freely remix.

The “Audience Reactive Composition”

Earlier this year at the SXSW Interactive festival, the design and development agency Deloitte Digital presented an immersive generative music installation called the “Audience Reactive Composition” (ARC). Created by the Dave & Gabe Interactive Installation Studio, the project was described in the online magazine The Verge as something “you would see at a Daft Punk concert fifteen years from now, with Tron-like neon lights and all manner of rotating spheres and illuminating touch-sensitive cylinders”. Users can play the ARC’s music through physical interaction with its five sculpted instrument interfaces, equipped with touch sensors, large glowing trackballs and hand-sized flat joysticks. By manipulating these instruments, players change the rhythm, melody, chords and sonic intensity of various elements in an algorithm-based piece of music created by the electronic musician known as RAC (André Allen Anjos). The result is played back over a system of 20 speakers that encircle the central instrument station. Synchronized light animations also appear in the surrounding environment in response to the resulting music.

The installation was created with the intention of inviting people who might normally be intimidated by playing a traditional musical instrument to become part of a music creation process and collaborate in real time with other players at the table, forming a sort of impromptu band of remix DJs. The promotional video for the project states that the resulting music “isn’t a static product, but a living organism.” I haven’t been able to find any technical design details, but the project illustrates an intriguing direction for generative performance systems in a live group setting.

Quarta – an iOS game with a generative music soundtrack

Screenshot of the Quarta game

Inspiration of the week discovered while conducting research for my final paper in the Digital Cultures course – the iOS game “Quarta” created by Peter Chilvers and Brett Gilbert.

Some intriguing aspects of this game: it’s easy to learn quickly, features a simple, intuitive interface with a clean visual design, and it’s quite addictive and replayable – you don’t get bored with it after a few matches. It also provides a soothing, ambient soundtrack that is generated based on the number and spatial arrangement of the black and white pieces on the board (and possibly the color of the various areas as well, though I haven’t yet noticed a correlation there), making it an interesting hybrid of a game and a musical instrument. Chilvers has a background in game development and generative music systems, having worked on projects such as the original Creatures release and the Will Wright game Spore, for which Chilvers collaborated with musician Brian Eno to create an ever-changing generative soundtrack. Chilvers and Eno have also created the generative music iOS apps Bloom and Scape – more information about all of their app collaborations is available at www.generativemusic.com. In my opinion, Quarta is a reminder of how engaging a simple and well-executed game concept can be.

Arduino/MicroView sound file controller & looper

A recent challenge in the EDPX 4010 course was to connect an Arduino device via a serial port to control a p5.js sketch. In this case, we’re working with the Arduino-compatible MicroView module that is included in this SparkFun Inventor’s Kit. I wanted to explore the p5 sound library further, so I made a simple device that controls the playback speed of an audio file (from 0 to 3x) with a potentiometer, and can also loop a chosen section of the audio file using pushbutton controls.

Pressing the black button sets the start point of the looped segment, and the red button sets the end point and begins the looped playback of that segment. Pressing the red button again sets a new end point, shortening the looped segment even further, while the black button stops the looping and continues normal playback. The MicroView screen displays the playback speed of the audio and the status of the black and red buttons. The p5 screen displays the current playback rate, whether looping is on or off (true or false), the status of the pushbuttons, the start (cued) time of the loop, and the current time of the audio file’s playback. The size of the yellow circle changes based on the playback rate. The p5 source code for the project is available here, and the MicroView/Arduino source code is here.

For the serial port connection, I used the p5.serialport library, along with the p5.serialcontrol GUI application to perform the actual serial communication, since JavaScript in a browser cannot interact directly with a serial port. To run this sketch, you must first open the serialcontrol application and then run the p5 sketch. Basically, the MicroView sends three values as a comma-separated text string through the serial port: the “digitalRead” state of the two buttons (0 or 1), and the “analogRead” value of the potentiometer, mapped to 0-255. The p5 sketch receives this text and parses the values with the split command, separating by commas. The sketch then uses those values to affect the playback speed and looping parameters. It also contains some logic checks so that a held button registers as only one push rather than continually changing values (a technique known as “state change” or “edge” detection).
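A trimmed-down sketch of the receiving side (the port name, variable names and empty handler bodies are placeholders, not the project’s actual code):

// Receiving and parsing the MicroView's "black,red,pot" string in p5.js,
// using the p5.serialport library (with p5.serialcontrol running).
let serial;
let potVal = 0;
let prevBlack = 0, prevRed = 0; // previous states for edge detection

function setup() {
  createCanvas(400, 200);
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1411'); // placeholder port name
  serial.on('data', gotData);
}

function gotData() {
  const line = serial.readLine().trim(); // e.g. "1,0,127"
  if (line.length === 0) return;
  const vals = split(line, ','); // black button, red button, pot value
  if (vals.length !== 3) return;
  const black = int(vals[0]);
  const red = int(vals[1]);
  potVal = int(vals[2]); // 0-255, mapped to the playback rate

  // State-change ("edge") detection: act only on the 0 -> 1 transition,
  // so a held button registers as a single push.
  if (black === 1 && prevBlack === 0) { /* set the loop start point */ }
  if (red === 1 && prevRed === 0) { /* set the end point, start looping */ }
  prevBlack = black;
  prevRed = red;
}

function draw() {
  background(0);
  fill(255, 255, 0);
  // the circle's size follows the pot value, as in the project
  ellipse(width / 2, height / 2, 20 + potVal / 255 * 60);
}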

Some glitches with the p5.sound library – before the playback of a loop begins, the library first stops the playing state, sets the loop cue times, and then restarts playing, which creates a short audible pause in the process. Also, I initially had the potentiometer control the direction as well as the speed, so that the audio could be played in reverse. However, the library seems to reset the playback point to the beginning of the file before it begins the reverse playback, so the forwards/backwards control does not sound seamless, always starting from the same point in the file. I’m interested in digging further into the code of the library itself to see if I can change that behavior.
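For reference, a stripped-down sketch of the rate-control portion (the file path is a placeholder, and the pot value would normally arrive over serial as above):

// Mapping a 0-255 pot reading to a 0-3x playback rate with p5.sound.
let song;
let potVal = 128; // placeholder; normally updated from the serial data

function preload() {
  song = loadSound('assets/example.mp3'); // placeholder file path
}

function setup() {
  createCanvas(200, 200);
  song.loop();
}

function draw() {
  background(0);
  // rate(1) is normal speed; values below 1 slow the file down.
  // Negative values play in reverse, but (as noted above) p5.sound
  // jumps back to the start of the file before reversing.
  song.rate(map(potVal, 0, 255, 0, 3));
}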

400 robot heads

Assignment for 4010 course: create a grid of robot heads, 20×20, with four variations shifting between rows or columns. The center four should “make a robot sound when clicked”. If you click on the center four figures in this sketch, you’ll hear a random quote spoken in synthesized speech, via the p5.speech library.

The single eye of each head also follows the mouse location, using p5’s “constrain” function. Source code is available here. The quotes were selected from this collection.
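A toy single-head version of both techniques (the quote text, head geometry and numbers here are placeholders):

// One robot head: p5.speech for the spoken quote, constrain() for the eye.
let voice;

function setup() {
  createCanvas(400, 400);
  voice = new p5.Speech(); // from the p5.speech library
}

function mousePressed() {
  // in the full sketch, only the four center heads trigger this
  voice.speak('I am a placeholder robot quote.');
}

function draw() {
  background(220);
  const cx = width / 2, cy = height / 2;
  rectMode(CENTER);
  fill(150);
  rect(cx, cy, 120, 120); // head
  fill(255);
  ellipse(cx, cy - 10, 50, 50); // eye socket
  // constrain() keeps the pupil inside the socket while it tracks the mouse
  const px = constrain(mouseX, cx - 12, cx + 12);
  const py = constrain(mouseY, cy - 22, cy + 2);
  fill(0);
  ellipse(px, py, 16, 16); // pupil
}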

3D sound possibilities

As spotted recently on the “prosthetic knowledge” tumblr site – the Holographic Whisper three-dimensional spatial audio speaker system. (The slightly-over-the-top-futuristic-tech-style promotional video is included below…)

The creators propose a “sound-point method” that enables control of “aerial audio distributions more flexibly and more precisely in comparison with conventional superdirectional (sound-beam) loudspeakers. This method can generate and vanish the sound sources freely in air. These point sound sources can deliver private messages or music to individuals.” Unfortunately, there is no clear link to the mentioned research paper, and it doesn’t look like a prototype has been developed at this point. But it certainly warrants further exploration – I’ve been intrigued for a while with the idea of creating a sonic installation in a space that could record the voices of attendees, and then play back segments of those recordings to future attendees, with the audio targeted (to be heard) at the same spatial location where the voices were recorded…a sonically “haunted” room filled with the voices of ghosts from past visitors.

Perlin noise “sound sphere” sketch

A sketch playing with the noise function of p5 and the sound library. The vertical positioning of each sphere, as well as the frequency of its oscillator, is shifted by stepping through the Y value of a noise sequence. The Z value of the noise sequence affects the diameter of the sphere and the amplitude of its oscillator.
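Here’s a boiled-down single-sphere version of that mapping (the names, ranges and step size are my assumptions; the full sketch manages up to 20 spheres in an array):

// One noise-driven "sound sphere": position + frequency from one noise
// stream, diameter + amplitude from another.
let osc;
let t = 0; // current position in the noise sequence

function setup() {
  createCanvas(400, 400, WEBGL);
  osc = new p5.Oscillator('sine');
  osc.start();
  osc.amp(0);
}

function draw() {
  background(0);
  const nY = noise(t, 100); // "Y" stream: vertical position and frequency
  const nZ = noise(t, 200); // "Z" stream: diameter and amplitude

  osc.freq(map(nY, 0, 1, 60, 1000)); // assumed frequency range
  osc.amp(nZ * 0.5, 0.05); // short ramp smooths amplitude changes

  noStroke();
  fill(200);
  translate(0, map(nY, 0, 1, -150, 150), 0);
  sphere(map(nZ, 0, 1, 10, 60));

  t += 0.01; // the step rate that the arrow keys adjust in the real sketch
}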

Instructions:
Add the initial “soundsphere” by pressing the right arrow. Use the left & right arrow keys to delete or add more spheres. (You can add up to 20.)
Use the up and down arrows to adjust the speed of the step rate through the noise value.
Use the ‘a’ and ‘s’ keys to move the selector left or right (selector is displayed at the bottom).
Use the spacebar to change the wave type of the selected sphere (wave type will be displayed at the top right).
You can change the frequency range of the oscillator wave for the selected sphere using the following keys (a sketch of this key handling follows the list):
1 = Decrease lower limit of range by 50 Hz
2 = Increase lower limit of range by 50 Hz
3 = Decrease lower limit of range by 200 Hz
4 = Increase lower limit of range by 200 Hz
7 = Decrease upper limit of range by 200 Hz
8 = Increase upper limit of range by 200 Hz
9 = Decrease upper limit of range by 50 Hz
0 = Increase upper limit of range by 50 Hz
The lowest frequency limit is 60 Hz and the highest is 15 kHz.
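The key handling might look something like this (freqLo and freqHi are assumed variable names, clamped to the limits above):

// Adjusting the selected sphere's frequency range from the number keys.
let freqLo = 60;    // lower limit of the range, in Hz
let freqHi = 15000; // upper limit of the range, in Hz

function setup() {
  createCanvas(200, 200);
}

function keyPressed() {
  if (key === '1') freqLo -= 50;
  if (key === '2') freqLo += 50;
  if (key === '3') freqLo -= 200;
  if (key === '4') freqLo += 200;
  if (key === '7') freqHi -= 200;
  if (key === '8') freqHi += 200;
  if (key === '9') freqHi -= 50;
  if (key === '0') freqHi += 50;
  // clamp to the 60 Hz - 15 kHz limits, keeping the range valid
  freqLo = constrain(freqLo, 60, freqHi);
  freqHi = constrain(freqHi, freqLo, 15000);
}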

The sketch will probably run more smoothly if you view it on its own page here. The source code is available here and is commented in detail. One issue I’m noticing for future investigation – some audio “clicking” distortion occurs in certain frequency ranges.

Further exploration of arrays, objects and “scribbles” in p5

My latest experiment sketch with p5.js involving work with arrays. You will probably need to click on the image to activate it. Use the “t”, “s” and “c” keys on your keyboard to add a triangle, square or circle to the scene (size and color are randomized).

The sketch can be viewed on a separate page here (which will probably perform at a faster frame rate), and the p5 source code is available here. Each shape is added as an independent object to the master shape array. The number of elements currently in the master array is displayed at the top left. As the shapes fall and are shuttled off to the left or right (again, a random choice), they are “spliced” from the array after they leave the screen, which is why the count goes down.
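The core pattern, sketched out (the Shape class here is a simplified stand-in for the triangle/square/circle objects):

// Master shape array: objects are pushed on key press and spliced
// out once they leave the screen.
let shapes = [];

class Shape { // simplified stand-in for the real shape objects
  constructor() {
    this.x = width / 2;
    this.y = 0;
    this.vx = random([-2, 2]); // shuttled left or right at random
    this.vy = 2;               // falling speed
  }
  update() { this.x += this.vx; this.y += this.vy; }
  display() { ellipse(this.x, this.y, 20, 20); }
}

function setup() {
  createCanvas(400, 400);
}

function keyPressed() {
  if (key === 'c') shapes.push(new Shape()); // 't' and 's' would add other shapes
}

function draw() {
  background(240);
  // iterate backwards so splice() doesn't skip the following element
  for (let i = shapes.length - 1; i >= 0; i--) {
    shapes[i].update();
    shapes[i].display();
    if (shapes[i].x < -50 || shapes[i].x > width + 50 || shapes[i].y > height + 50) {
      shapes.splice(i, 1); // removed once off screen, so the count drops
    }
  }
  fill(0);
  text(shapes.length, 10, 20); // current element count, top left
}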
This sketch also makes use of the p5.scribble library, which gives the shapes their jagged, “sketchy” appearance. If you uncomment the “randomSeed” statement on line 15 of the code, the jaggedness stops animating, since the randomization used for that effect (in the p5.scribble code) is then “seeded” continually with the same number. (This number could be anything…not just “98”.)
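A tiny demonstration of that seeding effect (assuming the p5.scribble library is loaded; 98 just mirrors the number in my code):

// p5.scribble jitter: re-randomized every frame by default, frozen
// when the random generator is re-seeded at the top of draw().
let scribble;

function setup() {
  createCanvas(400, 400);
  scribble = new Scribble();
}

function draw() {
  background(255);
  // randomSeed(98); // uncomment to freeze the wobble: the same random
  //                 // sequence then repeats every frame
  scribble.scribbleEllipse(width / 2, height / 2, 150, 150);
}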
