Rules and Play

So I’ve been hanging out with my local game dev group, Run Jump Dev (they’re wonderful), and we recently finished a run of a big collaborative interactive art project called “Rules and Play”. We were basically told, with a three-month lead time, that we would have two rooms in an art gallery for two months: ready, set, go. So after some experimentation, about five of us each came up with an exhibit, and we slowly began to merge them all into one cohesive thing.

For my part, in short, I used a Kinect, a computer, a homemade button box, some speakers, and a projector to create a series of projection-mapped scenes that would react to, reflect, and interact with the user both visually and aurally. It’s kinda easier to show; here are two examples:

That version actually doesn’t have the button box. Instead, I thresholded the mic input on the Kinect to trigger some visual effects.
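If you’re curious what that looks like mechanically, here’s a rough sketch in Python (not the actual TouchDesigner network, just the idea): watch an audio level and fire the effect on a rising edge, so a sustained noise only triggers once instead of every frame.

```python
# Toy sketch of mic-level triggering: find the points where the level
# first crosses a threshold, and only re-arm once it drops back below.
# The threshold value here is made up for the example.

def detect_triggers(levels, threshold=0.6):
    """Return the indices where the level rises above the threshold."""
    triggers = []
    above = False
    for i, level in enumerate(levels):
        if level >= threshold and not above:
            triggers.append(i)   # rising edge: fire the effect once
            above = True
        elif level < threshold:
            above = False        # re-arm once the level quiets down
    return triggers

# A quiet room with two loud moments:
levels = [0.1, 0.2, 0.8, 0.9, 0.3, 0.1, 0.7, 0.2]
print(detect_triggers(levels))  # [2, 6]
```

In TouchDesigner this whole thing is a couple of CHOPs wired together, which is exactly why the tool is so pleasant.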

Most of the magic behind all this was a program/language called TouchDesigner. If you’ve never used it, it’s a visual dataflow-like language and IDE, similar to LabVIEW, in a less, uh, awful way. You basically have blocks representing your inputs, e.g. the Kinect camera, IR sensor, player index, MIDI, microphone, etc. Then you use a massive library of native operators to manipulate all that data until you send it to an output at the end. It’s very modular, intuitive, and has completely changed my opinion of visual programming, and really how I think about GUIs in general. I can’t recommend it highly enough. It also has an interface to Ableton Live, another really progressive application, this one for music production, that you’re probably familiar with. Combine that with Ableton’s native support for Max, and you have a very cohesive, complete, and really quite elegant suite for creating some incredible audio-visual experiences.
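To give a flavor of the dataflow idea for folks who haven’t touched it, here’s a toy Python sketch (definitely not TouchDesigner code, just the shape of it): each block takes data in, transforms it, and hands it to the next, so a patch is really just a chain of small operators.

```python
# Each "operator" is a function over a stream of values; a patch is a
# chain of them, wired left to right like blocks on a network canvas.

def scale(factor):
    return lambda xs: [x * factor for x in xs]

def clamp(lo, hi):
    return lambda xs: [min(max(x, lo), hi) for x in xs]

def chain(*ops):
    """Compose operators left to right, like wiring blocks in a patch."""
    def run(data):
        for op in ops:
            data = op(data)
        return data
    return run

# "Sensor input" -> scale it up -> clamp it into a usable range:
patch = chain(scale(10), clamp(0, 25))
print(patch([1, 2, 4]))  # [10, 20, 25]
```

The real thing is that, but live, visual, and with hundreds of operators for video, audio, 3D, and control data.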

So after gathering some music from local artists, pictures from around town, and some animations from a digital animator in Run Jump Dev, I smashed it all together into a handful of scenes and did some crazy operations, from 3D pixelizations to cutouts, until I had a library of my own operators being switched in and out across these scenes, to a level of complexity where no one person’s experience was the same as another’s.

And on top of all this goodness, another member of our group developed a server for receiving MIDI data and sending it on to any device subscribed in the exhibit. So after creating some MIDI boxes with buttons, sliders, and knobs, users on one side of the room were able to trigger events all over the exhibit. This last bit was really all thrown together over the course of the three days we had to install everything.
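The server itself isn’t mine to share, but the pub/sub idea behind it sketches out to something like this (hypothetical names throughout, just the concept): devices subscribe to a hub, and any MIDI-style message sent in gets fanned out to every other subscriber.

```python
# Hypothetical sketch of a MIDI fan-out hub (not the actual server).
# Devices register a handler; messages from one device reach the rest.

class MidiHub:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, name, handler):
        self.subscribers[name] = handler

    def send(self, sender, message):
        # Fan the message out to everyone except the device that sent it.
        for name, handler in self.subscribers.items():
            if name != sender:
                handler(message)

hub = MidiHub()
received = []
hub.subscribe("button_box", lambda msg: received.append(("button_box", msg)))
hub.subscribe("projector_rig", lambda msg: received.append(("projector_rig", msg)))

# A knob turn on the button box reaches the projector rig, not itself:
hub.send("button_box", {"control": 7, "value": 100})
print(received)  # [('projector_rig', {'control': 7, 'value': 100})]
```

With something this simple in the middle, any box with a network connection can poke any scene in the room, which is exactly the kind of chaos that made the exhibit fun.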

All told, this experience was intensely stressful, terrifying at times, and the most fun I’ve ever had developing. After the initial two-month showing, people liked it so much that we got two more exhibits: one at the Pam Miller Gallery, and then one at the Studio 300 collaborative art event, which really deserves its own post. We discovered all these amazing tools over a very short period of time, but I think we’ve only just scratched the surface of what’s possible. Can’t wait for next year.