Kinect + Quartz + Oscar Wilde

For my first in-class presentation this semester, I performed “My Profundis”, a little visual-sonic-textual-movement essay in which my alter ego SkyRon™ channels Oscar Wilde while generating trippy visuals via Kinect. The video cube’s location and rotation are controlled by the left hand; its size, along with the position of the fuzzy halo-balls (a particle system), is controlled by the right hand. A live camera, pointed at the screen or monitor, feeds visuals to the background and the cube, creating a very hazy style of video feedback. There’s also a patch I found online that motion-blurs the video background according to the position of your body.
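(If you’re curious what that mapping amounts to, here’s a rough Python sketch of the idea. The actual logic is wired up visually inside the Quartz Composer patch, so every name and range below is made up purely for illustration.)

```python
# Hypothetical sketch of the hand-to-cube mapping; the real version is
# built graphically in Quartz Composer, so this is illustrative only.

def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly rescale value from one range to another."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def cube_params(left_hand, right_hand):
    """Each hand is an (x, y, z) position, normalized to roughly -1..1."""
    lx, ly, lz = left_hand
    rx, ry, rz = right_hand
    return {
        "position": (lx, ly),                          # left hand places the cube...
        "rotation_deg": map_range(lz, -1, 1, 0, 360),  # ...and spins it
        "scale": map_range(ry, -1, 1, 0.2, 3.0),       # right hand resizes it...
        "particle_pos": (rx, ry),                      # ...and steers the halo-balls
    }
```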

This is my first project using the Kinect; it drives a patch I created in Quartz Composer (with a big assist from the middleware Synapse and a patch made in Max/MSP). So far I can’t find a whole lot of documentation on Kinect-to-Quartz work beyond the Synapse site. If you’re interested in a more detailed exploration of the Kinect (with Processing, Arduino, MakerBot, etc.), I suggest the O’Reilly book Making Things See by Greg Borenstein, which uses the OpenNI library to build interactions in Processing.
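For anyone who’d rather poke at this pipeline from code than from Quartz: Synapse streams skeleton joint positions as OSC messages, and (as I understand it) you have to re-request each joint every couple of seconds or it stops sending. Here’s a minimal Python sketch using the python-osc library; the ports and message names are what I believe Synapse uses by default, so double-check them against the Synapse site.

```python
# Minimal sketch of reading Synapse's right-hand position over OSC.
# Assumes the python-osc library (pip install python-osc); ports and
# message names are from the Synapse docs as I remember them.
import threading
import time

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

def on_right_hand(address, x, y, z):
    """Called whenever Synapse reports a new right-hand position."""
    print(f"right hand at ({x:.2f}, {y:.2f}, {z:.2f})")

dispatcher = Dispatcher()
dispatcher.map("/righthand", on_right_hand)

# Synapse (as far as I know) sends joint data to port 12345
# and listens for requests on port 12346.
server = ThreadingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Synapse stops tracking a joint unless you re-request it every few
# seconds; the 2 here should mean "world coordinates" (1 = body, 3 = screen).
client = SimpleUDPClient("127.0.0.1", 12346)
while True:
    client.send_message("/righthand_trackjointpos", 2)
    time.sleep(2)
```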

“My Profundis” is at a conceptual growth-point and may take a number of different paths as it develops. I’ll have a better idea of how to incorporate the Kinect into performance once I connect the sensor to other software, especially Pure Data (for sound design/composition) and Unity, and possibly Processing, once I overcome my Terminal (Unix) anxiety!

What you need to recreate this (besides a MacBook Pro running Lion, a separate video camera, and a Kinect sensor) are Synapse (downloads here) and my Quartz composition. Enjoy!