Info: In August of 2011 I was presented with an amazing opportunity. Possible Productions reached out to me regarding an upcoming project with Bionic League: Deadmau5’s 2011/2012 tour, Meowingtons Hax. Possible Productions (or Possible) is an art and design practice based in Los Angeles that aligns filmmaking, fine art, and computer science to tell amazing stories.

Possible was put in charge of designing and creating the visual experience for Deadmau5’s show. I met up with Michael Figge and Alex Afzali from Possible, and they described their vision for the tour’s visuals and how they wanted their content to be as tightly synced to the show as possible.

Here is where it gets interesting: there is usually a set song list for the show; however, Deadmau5 sometimes deviates from the set list and mixes on the fly, which essentially throws off Possible’s pre-rendered visual content. Possible told me they wanted all the visuals to sync perfectly to the music. So there was my challenge as the hacker / artist / designer / creative. The only problem was that they needed something in a week (5 days of working time, which is insanely short for production software).

I don’t think I was truly ready for what was bestowed upon me, but I didn’t care and took on the project anyway. I love electronic music (abstract, experimental, glitch, dance, etc.); it’s what drives me and inspires me.

After Figge and Alex gave me the lowdown on the challenge, I thought of a couple of creative application solutions. After our discussions, I realized that their visuals were key to the show, so my role wasn’t to create new visuals, but to figure out how to affect Possible’s visuals to make them feel in sync with the show. So I decided to build an application that would output generative visuals, and these visuals would act as a mask for the pre-rendered content. In addition to the generative masks, Possible wanted the application to be MIDI-mappable, so if Deadmau5 toggles a button or moves a fader up and down, the show’s visuals reflect that. Moreover, my app had to pixel map the visuals so they displayed perfectly on Deadmau5’s stage setup, which was composed of two stealth LED wedges, an LED back wall, 9 small cubes (with LED panels on each visible side), and the main cube (which had large LED panels on each visible side as well).
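To give a sense of how that masking approach can work in openFrameworks, here is a minimal, hypothetical sketch (not Possible’s actual pipeline): the generative layer is rendered to an FBO and multiplied over the pre-rendered content, so white areas of the mask reveal the content and black areas hide it.

```cpp
// Hypothetical sketch: multiply a generative mask over pre-rendered content.
// maskFbo holds the generative layer; contentTex is the pre-rendered frame.
#include "ofMain.h"
#include <functional>

class MaskCompositor {
public:
    ofFbo maskFbo;

    void setup(int w, int h) {
        maskFbo.allocate(w, h, GL_RGBA);
    }

    // Render the active generative layer into the mask FBO
    // (white = show content, black = hide it).
    void renderMask(std::function<void()> drawLayer) {
        maskFbo.begin();
        ofClear(0, 0, 0, 255);
        drawLayer();
        maskFbo.end();
    }

    // Composite: pre-rendered content multiplied by the mask.
    void draw(ofTexture& contentTex) {
        ofSetColor(255);
        contentTex.draw(0, 0);
        ofEnableBlendMode(OF_BLENDMODE_MULTIPLY);
        maskFbo.draw(0, 0);
        ofDisableBlendMode();
    }
};
```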

I decided to develop the application in openFrameworks, since that seemed like the most appropriate tool for the job. Back in 2011 I was comfortable with OF, but I hadn’t yet created any of my [now] go-to libraries/addons to help me accomplish this feat. So in that week of design and development, I created the infant/pre-alpha versions of ofxUI, ofxLayers, ofxMidiMapper, ofxBeatDetector, and ofxGenerative.

In short, the application, dubbed “Rezanator”, was composed of 5 visual layers, or compositions. Rezanator ran one layer at a time, so Possible could switch layers whenever they needed a different look.
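ofxLayers grew out of this structure; its early API isn’t shown here, but a minimal sketch of a one-active-layer system might look like this (all names are illustrative):

```cpp
// Minimal sketch of a one-active-layer system; not the actual ofxLayers API.
#include "ofMain.h"
#include <vector>
#include <memory>

class Layer {
public:
    virtual ~Layer() {}
    virtual void update() {}
    virtual void draw() = 0;
};

class LayerManager {
    std::vector<std::shared_ptr<Layer>> layers;
    int active = 0;
public:
    void add(std::shared_ptr<Layer> layer) { layers.push_back(layer); }

    // Only one layer runs at a time; switching changes the look instantly.
    void setActive(int index) {
        if (index >= 0 && index < (int)layers.size()) active = index;
    }
    void update() { if (!layers.empty()) layers[active]->update(); }
    void draw()   { if (!layers.empty()) layers[active]->draw(); }
};
```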

Each layer generated a unique-looking mask. The first layer, “Black Out”, was used when the visuals needed to be muted or hidden completely, which was very useful for breakdowns before the drop. The second layer, “White Out”, was a pass-through layer (meaning all the pre-rendered visual content for the show would show on stage). With these two layers alone, Possible could have MIDI-mapped the layer UI buttons and toggled them on the fly to create a flashing effect on stage. The next layer, “Dynamic Mask”, produced simple squares and rectangles which oscillated in size. This layer had controls that allowed Possible to make the shapes oscillate at different rates and offset the phase of each shape to produce the effect shown in the image below and video (Deadmau5 + Rezanator). The size of the shapes could be modified as well, allowing for various types of visual masking effects.
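A rough sketch of the “Dynamic Mask” oscillation logic, assuming per-shape rate and phase-offset parameters standing in for the MIDI-mapped sliders:

```cpp
// Sketch of the "Dynamic Mask" idea: a grid of squares whose sizes
// oscillate, with adjustable rate and per-shape phase offset.
// rate and phaseOffset are stand-ins for the MIDI-mapped slider values.
#include "ofMain.h"

void drawDynamicMask(int cols, int rows, float rate, float phaseOffset) {
    float t  = ofGetElapsedTimef();
    float cw = ofGetWidth()  / (float)cols;
    float ch = ofGetHeight() / (float)rows;
    ofSetColor(255);
    for (int y = 0; y < rows; y++) {
        for (int x = 0; x < cols; x++) {
            float phase = (y * cols + x) * phaseOffset;
            // size oscillates between 0 and the full cell size
            float s = ofMap(sin(t * rate + phase), -1, 1, 0, 1);
            ofDrawRectangle(x * cw + cw * 0.5f * (1.0f - s),
                            y * ch + ch * 0.5f * (1.0f - s),
                            cw * s, ch * s);
        }
    }
}
```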

The last two layers, “Generative Mask” and “Audio Reactive”, were the most interesting though. For the “Generative Mask” layer I created a generative piece composed of particles arranged in concentric circles that were repelled by the mouse position on screen. Thus when the mouse moved within a certain radius of the particles, the particles would react by getting pushed in the z-direction (first image in the post and another directly below). The particles were simulated in real time, so Possible had control over their damping, homing force (pulling them back to their original positions), and the repulsion force scalar (which determined how far the particles were pushed). These parameters were exposed via interface sliders and buttons and were MIDI-mappable, allowing various aesthetics and motions to be achieved in real time!
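The per-particle simulation reduces to the three forces mentioned above: a homing spring, a cursor repulsion along z, and damping. Here is a hedged sketch (the parameter names are mine, not ofxGenerative’s):

```cpp
// Sketch of the per-particle simulation: a spring pulls each particle
// home, the cursor repels it along z, and damping settles the motion.
// homingForce, repulsionScale, and damping mirror the MIDI-mapped sliders.
#include "ofMain.h"

class Particle {
public:
    ofVec3f home, pos, vel;

    void update(const ofVec2f& cursor, float homingForce,
                float repulsionScale, float repulsionRadius, float damping) {
        // homing: spring back toward the rest position
        ofVec3f acc = (home - pos) * homingForce;

        // repulsion: push along z when the cursor is near (in xy)
        float d = cursor.distance(ofVec2f(pos.x, pos.y));
        if (d < repulsionRadius) {
            acc.z += (1.0f - d / repulsionRadius) * repulsionScale;
        }

        vel = (vel + acc) * damping;   // damping < 1 bleeds off energy
        pos += vel;
    }
};
```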

The “Audio Reactive” layer was one of the most interesting to program because of how it mapped the visuals onto the cubes on stage. It was the most rewarding because of the visual effect achieved and its tight synchronization with the music. I wanted to make a square move from one face of the main cube to the next, and I wanted it to transition to the next face when a kick was detected. The end result made the square look like it was being pushed by the bass, and it projection mapped perfectly onto the cube’s surface (check out the image below, which shows the stage’s pixel map and how the square visuals mapped onto it). I wrote a beat detector, ofxBeatDetector (couldn’t have done it without this reference), and used that to trigger a pushing force on the square. The force was proportional to the loudness of the audio, so the louder the kick/drop, the further the square traveled. The square’s damping and the force’s magnitude scalar were controllable via the UI, so the movement was tweakable on the fly. This allowed for various hypnotic / stroboscopic effects.
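The triggering logic can be sketched like so; this is not the actual ofxBeatDetector API, just an illustration of how a detected kick becomes an impulse whose size tracks loudness, with damping slowing the square between hits:

```cpp
// Sketch of the beat-pushed square: a detected kick applies an impulse
// proportional to loudness, and damping slows the square as it travels
// across the unwrapped cube faces. All numbers are illustrative.
#include "ofMain.h"

class PushedSquare {
public:
    float pos = 0;          // position along the unwrapped cube faces
    float vel = 0;
    float faceWidth = 128;  // LED pixels per cube face (illustrative)
    int   numFaces  = 4;    // visible side faces of the main cube (assumed)

    // Called by the beat detector when a kick is found.
    void onBeat(float loudness, float forceScale) {
        vel += loudness * forceScale;   // louder kick, bigger push
    }

    void update(float damping) {
        pos += vel;
        vel *= damping;
        // wrap around once the square runs off the last face
        pos = fmodf(pos, faceWidth * numFaces);
    }

    void draw(float size) {
        ofSetColor(255);
        ofDrawRectangle(pos, 0, size, size);
    }
};
```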

The last task was to integrate my app into Possible’s setup. I used Syphon (via ofxSyphon) to send the visuals out so they could be mixed however Possible wanted to mix them. I also added a fullscreen mode so that Possible could capture the output from the screen (i.e. using a Blackmagic or Matrox capture card).
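For reference, publishing frames over Syphon with ofxSyphon takes only a few lines; the server name “Rezanator” here is illustrative:

```cpp
// Publishing the app's output over Syphon via ofxSyphon so a mixer
// (e.g. a VJ app) can pick up the frames as a texture stream.
#include "ofMain.h"
#include "ofxSyphon.h"

class ofApp : public ofBaseApp {
    ofxSyphonServer server;
public:
    void setup() {
        server.setName("Rezanator");  // name clients see (illustrative)
    }
    void draw() {
        // ... draw the active layer / mask here ...
        server.publishScreen();  // share the current frame over Syphon
    }
};
```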

After a week of intensive coding and iterating, I was happy with the results. I wrote a beat detector, various interface elements, a view / view controller system for openFrameworks, a MIDI mapper, and various visual masks. I learned a ton from this project. It was the first of many projects that drove me to develop ofxUI and various other addons for openFrameworks to make my development workflow easier and faster. It’s been about two years, and my tools have matured quite a bit by being used and extended on various other projects / platforms.

In the end, Possible went on to produce amazing visuals for Deadmau5’s tour. I am happy that I got to be part of it; thank you Possible, Bionic League, Deadmau5, and VJ Fader (for connecting me with Possible). Here is a behind-the-scenes video from Possible:

Agency: Possible Productions
Designer & Developer: Reza Ali      
Client: Deadmau5
Output: Custom Software (C++, openFrameworks)
Year: 2011