Pd via Kinect: SkyRon’s QuadTheremin and Sampler

Yeah, I know—straight out of 2011, but some of us have been busy with other distractions. So, above is a preview of a set of QuadTheremin essays, and below is SkyRon™’s initial performance, assisted by Pooster Beest, of Kick-Ass Kitty fame (view that video here).
And, yes, it’s not really four independent channels of theremin wonderfulness—more like two dual-pitch stereo channels. The x- and y-axes control different pitches for each hand, with the z-axis controlling dynamics/volume. I need to adjust the Pd-Kinect router patch to make the four channels truly independent.
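For the curious, the mapping itself is simple enough to sketch outside Pd. Here is a hypothetical Python version of the scheme described above: two pitch axes per hand, with the z-axis scaling volume. The axis ranges and note bounds are illustrative guesses, not values from the actual patch.

```python
# Hypothetical sketch of the hand-to-sound mapping: x and y each drive
# a pitch, z drives volume. All ranges below are illustrative only.

def axis_to_midi(value, lo=-600.0, hi=600.0, midi_lo=48, midi_hi=84):
    """Map a raw axis reading (e.g. millimeters from body center) to a MIDI note."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))  # clamp to [0, 1]
    return round(midi_lo + t * (midi_hi - midi_lo))

def midi_to_hz(note):
    """Equal-tempered frequency for a MIDI note number (A4 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

def z_to_volume(z, near=300.0, far=900.0):
    """Closer hand (smaller z) means louder, scaled to [0, 1]."""
    t = max(0.0, min(1.0, (z - near) / (far - near)))
    return 1.0 - t

# One hand produces two pitches (x-axis and y-axis) plus a volume:
x, y, z = 150.0, -200.0, 450.0
print(axis_to_midi(x), axis_to_midi(y), round(z_to_volume(z), 2))
```

In the real patch this arithmetic lives in Pd objects rather than code, of course, but the clamping-then-scaling shape of the mapping is the same idea.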
I suspect this will be a useful start for approaching Kinect control of sound, especially if one keeps it simple—sure, we could add wah-wah or vocoder effects to each foot, but I’m not that good of a dancer! Hands are easier, plus we’ve all studied conducting anyway, right?
For a quick overview of how to build this, go here. Oh, and, OK, here is the first draft of a sample player controlled in PureData via Kinect—enjoy!

Kinect + Quartz + Oscar Wilde

For my first in-class presentation this semester, I performed “My Profundis”, a little visual-sonic-textual-movement essay in which my alter ego SkyRon™ channels Oscar Wilde while generating trippy visuals via Kinect. The video cube’s location and rotation are controlled by the left hand; its dimensions, along with the position of the fuzzy halo-balls (a particle system), are controlled by the right hand. A live camera feeds visuals to the background and the cube; pointed at the screen or monitor, it creates a very hazy style of video feedback. There’s also a patch I found online that motion-blurs the video background according to the position of your body.

This is my first project using Kinect, which drives a patch I created in Quartz Composer (with a big assist from the middleware Synapse and a patch made in Max/MSP). Currently, I can’t find much documentation on Kinect-Quartz work other than the Synapse site. If you’re interested in a more detailed exploration of Kinect (with Processing, Arduino, MakerBot, etc.), I suggest the O’Reilly book Making Things See by Greg Borenstein, which uses the OpenNI library to create interaction with Processing.
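Under the hood, Synapse hands joint positions to other programs as OSC messages (addresses like /righthand_pos_body carrying x, y, z floats). A minimal listener in Python, sketched here with the third-party python-osc package, gives a feel for what Quartz, Max, or Pd receives; the port number and message names are my assumptions from the Synapse site, so verify them against its documentation.

```python
# Hypothetical Python listener for Synapse's OSC stream, using the
# third-party python-osc package. The address pattern
# "/<joint>_pos_body" and port 12345 are assumptions to verify
# against the Synapse documentation.

def format_joint(address, x, y, z):
    """Render one joint message as a readable line, e.g. for logging."""
    joint = address.lstrip("/").split("_")[0]  # "/righthand_pos_body" -> "righthand"
    return f"{joint}: x={x:.1f} y={y:.1f} z={z:.1f}"

if __name__ == "__main__":
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    disp = Dispatcher()
    # One handler per joint we care about. (Synapse also expects a
    # periodic "track joint" message sent back to it to keep the stream
    # alive; see the Synapse site for the exact convention.)
    for joint in ("righthand", "lefthand"):
        disp.map(f"/{joint}_pos_body",
                 lambda addr, x, y, z: print(format_joint(addr, x, y, z)))

    BlockingOSCUDPServer(("127.0.0.1", 12345), disp).serve_forever()
```

In the actual piece this routing happens in the Max MSP patch and in Quartz Composer rather than Python, but the OSC plumbing is the same regardless of which host consumes the messages.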

“My Profundis” is at a conceptual growth point and may take a number of different paths as it develops. I’ll have a better idea of how to incorporate Kinect into performance once I further connect the sensor to other software, especially PureData (sound design/composition) and Unity, and possibly Processing, once I overcome my Terminal (Unix) Anxiety!

What you need to recreate this (besides a MacBook Pro running Lion, a separate video camera, and a Kinect sensor) is Synapse (downloads here) and my Quartz composition. Enjoy!