So, SkyRon™ has been buggin’ me for a while to put a more-or-less definitive version of his performance on Kombo™ (quad-theremin/sample player) online. Here it is!
This can be understood in terms of the little paper I wrote a while ago, which has, astonishingly, garnered more than 1,600 views to date!
Finally, I’ve self-published a modest paper on my approach to using Kinect in live performance. Read or download it here.
It happened. Some people got hurt, bad. Some may have even wandered home afterwards and hanged themselves from silk stockings or rough rope they bought at Home Depot, despondent over the current state of the arts . . . .
(of course, I jest . . . . )
meme™—media experimental ensemble™—provided a momentary distraction from that state at the FAU Faculty Biennial exhibition last Friday (September 20, 2013) on the Boca campus. It was a well-attended event, and much curiosity was aroused by a certain crazed mad scientist in black conducting his torso-controlled remix of Strauss’s Also Sprach Zarathustra (seen in the video above, over the final scrolling credits).
Additional Quartz compositions, videojam sets, and film remixes of the work of Ariel Baron-Robbins were provided by Anna Torlen, James Ko, David Wichinsky, Rosena Francois, and August DeWinkler.
Yeah, I know—straight out of 2011, but some of us have been busy with other distractions. So, above is a preview of a set of QuadTheremin essays, and below is SkyRon™’s initial performance, assisted by Pooster Beest, of Kick-Ass Kitty fame (view that
And, yes, it’s not really four independent channels of theremin wonderfulness—more like two dual-pitch stereo channels. On each hand, the x- and y-axes control different pitches, while the z-axis controls dynamics/volume. I need to adjust the Pd-Kinect router patch to make the four channels truly independent.
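For the curious, here’s a rough sketch (in Python, just for illustration) of the kind of mapping described above: each hand’s x and y positions pick two pitches, and depth (z) scales the volume, theremin-style. The coordinate ranges, MIDI span, and function names are my own assumptions, not the actual Pd-Kinect router patch.

```python
def axis_to_midi(value, lo=-600.0, hi=600.0, midi_lo=48, midi_hi=84):
    """Map one Kinect axis reading (roughly millimetres from the torso,
    an assumed range) onto a MIDI note number, clamped to the range."""
    value = max(lo, min(hi, value))
    span = (value - lo) / (hi - lo)
    return round(midi_lo + span * (midi_hi - midi_lo))

def hand_to_voices(x, y, z, z_near=300.0, z_far=1500.0):
    """One hand -> (pitch from x, pitch from y, volume 0..1).
    Closer to the sensor (smaller z) means louder."""
    vol = 1.0 - (max(z_near, min(z_far, z)) - z_near) / (z_far - z_near)
    return axis_to_midi(x), axis_to_midi(y), round(vol, 2)

# Example: one hand raised and extended, fairly close to the sensor
print(hand_to_voices(300.0, 150.0, 600.0))   # → (75, 70, 0.75)
```

Making the four channels truly independent would mean giving each hand its own pair of mapping ranges instead of sharing one `axis_to_midi` curve.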
I suspect this will be a useful start for approaching Kinect control of sound, especially if one keeps it simple—sure, we could have wahwah or vocoder effects added to each foot, but I’m not that good of a dancer! Hands are easier, plus we’ve all studied conducting anyway, right?
For a quick overview of how to build this, go here. Oh, and, ok, here is the first draft of a sample player controlled in PureData via Kinect—enjoy!
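To give a flavor of the sample-player side, here’s a tiny Python sketch of the sort of trigger logic a Kinect sample player needs: fire a sample once when a hand pushes past a depth threshold, then re-arm only after it pulls back (a simple Schmitt trigger, so one gesture equals one hit). The thresholds and class name are my inventions, not the actual Pd patch.

```python
class GestureTrigger:
    """Fire once per forward push of a hand, using two depth thresholds
    so jittery sensor readings don't re-trigger the sample."""

    def __init__(self, fire_at=500.0, rearm_at=700.0):
        self.fire_at = fire_at      # fire when z drops below this (hand pushed forward)
        self.rearm_at = rearm_at    # re-arm when z rises back above this
        self.armed = True

    def update(self, z):
        """Return True exactly once per forward push."""
        if self.armed and z < self.fire_at:
            self.armed = False
            return True
        if not self.armed and z > self.rearm_at:
            self.armed = True
        return False

trig = GestureTrigger()
zs = [900, 800, 450, 400, 800, 420]       # simulated depth readings
print([trig.update(z) for z in zs])       # → [False, False, True, False, False, True]
```

In Pd the same idea would be a couple of [moses] objects and a toggle; the point is the hysteresis, without which a hovering hand machine-guns the sample.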
For my first in-class presentation this semester, I performed “My Profundis”, a little visual-sonic-textual-movement essay where my alter ego SkyRon™ channels Oscar Wilde while generating trippy visuals via Kinect. The video cube’s location and rotation are controlled by the left hand, and its dimension, along with the position of the fuzzy halo-balls (particle system), is controlled by the right hand. A live camera feeds visuals to the background and the cube, and is pointed at the screen or monitor, creating a very hazy style of video feedback. There’s also a patch I found online that motion-blurs the video background according to the position of your body.
This is my first project using Kinect; the sensor drives a patch I created in Quartz Composer (with a big assist from the middleware Synapse, plus a patch made in Max MSP). Currently, I can’t find much documentation on Kinect-Quartz beyond the Synapse site. If you’re interested in a more detailed exploration of Kinect (with Processing, Arduino, MakerBot, etc.), I suggest the O’Reilly text Making Things See by Greg Borenstein, which uses the NI library to create interaction with Processing.
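Since I mention wiring the sensor to PureData below, here’s a minimal sketch of the bridging idea: Synapse reports joint positions as OSC messages (addresses like "/righthand" with x, y, z values, as I recall), and Pd’s [netreceive] speaks the FUDI protocol, where each message is space-separated atoms ending in a semicolon. The joint name and values below are made up for illustration; a real bridge would first read the OSC stream from Synapse.

```python
def to_fudi(joint, x, y, z):
    """Format one joint reading as a FUDI message that Pd's [netreceive]
    can parse: atoms separated by spaces, terminated by a semicolon."""
    return f"{joint} {x:.1f} {y:.1f} {z:.1f};\n"

# Example: a hypothetical right-hand reading, ready to send to Pd
msg = to_fudi("righthand", 200.0, 40.0, 600.0)
print(msg.strip())   # → righthand 200.0 40.0 600.0;
```

On the Pd side, a [netreceive] plus [route righthand lefthand] would then fan the readings out to the sound-design patches.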
“My Profundis” is at a conceptual growth-point and may take a number of different paths as it develops. I’ll have a better idea of how to incorporate Kinect into performance once I connect the sensor to other software—especially PureData (sound design/composition) and Unity, and possibly Processing, once I overcome my Terminal (Unix) Anxiety!
What you need to recreate this (besides a MacBook Pro running Lion, a separate video camera, and a Kinect sensor) is Synapse (downloads here) and my Quartz composition. Enjoy!