. anyone playing with the kinect ?
Posted: Tue Nov 30, 2010 6:24 am
by anomad
. so, after seeing the open source drivers released i picked one up to play around w/ (and to work on my dance moves.. )
. in following the various projects, the tuiokinect ( http://code.google.com/p/tuiokinect ) caught my eye as a start to a performance controller.
. so, i loaded it up on a mac mini and sent the video into a capture card on my macpro running modul8. i had to move the input screens around for each of the three layers -
. layer 1 - depth gradient
. layer 2 - just 'active' hands w/in a certain range
. layer 3 - the grey-scale webcam capture
. (but, i didn't crop any layers so i think the effects i used suffered b/c of that)
. overall, i'm pleased w/ my initial tests - http://vimeo.com/17314948
. i can see some potential for interaction by mapping depth information to MIDI or OSC messages - or an intermediate 'plugin' that allows you to modify the sensitivity of the depth sensor and webcam data as independent video inputs...
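. for example, a quick python sketch of the depth-to-OSC idea (just a mock-up assuming the libfreenect python wrapper and the python-osc package - the port and address are made up, and none of this is part of tuiokinect):

# rough sketch: read kinect depth frames and send the nearest point out as OSC
import freenect
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)        # whatever is listening for OSC

while True:
    depth, _ = freenect.sync_get_depth()           # 640x480 array of 11-bit values
    valid = depth[depth < 2047]                    # 2047 = 'no reading'
    if valid.size:
        client.send_message("/kinect/near", float(valid.min()) / 2047.0)

. anything that speaks OSC could then map that value onto an effect parameter.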
. anyone else playing around w/the kinect ?
-james
(a nomad. )
Posted: Tue Nov 30, 2010 3:33 pm
by The Midi Thief
I've been following the development of the open source Kinect drivers and what people do with them pretty closely. But I decided to wait until I either saw something really interesting or came up with a really interesting experiment I wanted to do. Neither has happened so far.
Could you tell me a little bit about how it works? Do you get one video screen split in four, with different depth maps in each quarter? I'm a bit worried about lag if you use it for tracking. How does the Kinect connect to the computer, btw? USB?
Posted: Wed Dec 01, 2010 4:52 am
by anomad
. i have the kinect for the original xbox, so it's usb 2.0 and needs its own power supply.
. afaik, it sends two data streams: one is a 640x480 color webcam feed (bayer filter), the other is depth data (which is often converted to grey scale or color). from what i gather, the depth stream can be 'fine tuned' to be more or less precise.
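. windowing that depth data into a grey-scale 'zone' is only a few lines - a sketch assuming the libfreenect python wrapper and opencv (the near/far numbers are made up, you'd tune them):

# sketch: clip the raw 11-bit depth to a near/far band and view it as grey-scale
import freenect
import numpy as np
import cv2

NEAR, FAR = 600, 900                               # raw depth units - pick your zone

depth, _ = freenect.sync_get_depth()               # uint16, 0..2047 (2047 = no reading)
band = np.clip(depth, NEAR, FAR).astype(np.float32)
grey = ((band - NEAR) / (FAR - NEAR) * 255).astype(np.uint8)
cv2.imshow("depth band", grey)
cv2.waitKey(0)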
. here's a screen cap i took from cocakinect, a very basic program someone uploaded - http://www.npav.com/firstKinect.jpg . my thought is that these streams could easily be converted into their own video feeds.. probably via macam or something like that.
. the tuiokinect app i downloaded ( http://code.google.com/p/tuiokinect/ ) splits it into 4 screens (depth, b/w cam, 'zone', and x/y of zone -- check out the video). so for this test, i was capturing that window into my machine running modul8 and moving/zooming each window into place.
. there isn't a huge amount of lag in my experiments. a lot of the sluggishness in the video comes from me not cropping the input video (a processor-intensive effect like IscFlame1 was 'flaming' the entire frame even though i was zoomed in on one small part), and from capturing the output of modul8, which is always a little stutter prone.
. i'm currently trying to get the code to compile (the author of tuiokinect graciously provides an Xcode project of the app), reduce the windows to 320x240 for easier capture, and add MIDI out for the x/y coordinates so they could control the center of some effects. after getting more familiar w/ the TUIO libraries i might try some more interesting things.
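. the MIDI part on its own is pretty small - a throwaway python sketch of what i mean (python-rtmidi assumed, CC numbers 20/21 are arbitrary, and this is separate from the tuiokinect Xcode project):

# sketch: push a normalised x/y position out as two MIDI CCs
import rtmidi

midiout = rtmidi.MidiOut()
midiout.open_virtual_port("kinect-xy")             # appears as a MIDI source on os x

def send_xy(x, y):                                 # x, y in 0.0 .. 1.0
    midiout.send_message([0xB0, 20, int(x * 127)]) # CC 20 = x
    midiout.send_message([0xB0, 21, int(y * 127)]) # CC 21 = y

send_xy(0.5, 0.25)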
. i don't foresee a 'minority report'-like vj interface, but i'm drawn to the depth sensing (which is IR, so it works in zero-light conditions) for all sorts of neat effects. adding an x/y/(z?) to a setup could be pretty sweet as well.... but it's been a while since i dug into anything Xcode.
-james
(a nomad. )
Posted: Thu Dec 02, 2010 6:19 am
by lotech
I got myself a Kinect but haven't had a chance to plug it into my laptop yet.
Your preliminary work certainly looks impressive for something that's only been available for a few weeks. It would be good if we could capture either of the inputs directly into Modul8.
Keep posting your progress, and when I get a chance I will too.
Posted: Fri Dec 03, 2010 1:52 am
by The Midi Thief
Anomad, thanks for the nice report!
Posted: Fri Dec 24, 2010 10:29 pm
by roirrawtobor
Hi! Didn't feel like making a new thread for it..
But does anybody have an idea how this guy does it?
http://www.youtube.com/watch?v=VscAvkdDUKk
Posted: Mon Dec 27, 2010 1:11 am
by anomad
. he mentions that he's using processing - the rest looks like simple x/y/z rotations and a smooth (sine) wave modifying the size of the hexagons (probably the built-in core video effect from os x 10.6).
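. the sine part is easy to mock up - a toy python sketch of the idea (all values arbitrary, nothing to do with his actual processing code):

# toy sketch: 'breathe' a shape's size with a sine wave
import math, time

BASE, AMOUNT, SPEED = 100.0, 30.0, 0.5             # arbitrary size / depth / speed

start = time.time()
for _ in range(20):
    t = time.time() - start
    size = BASE + AMOUNT * math.sin(2 * math.pi * SPEED * t)
    print(round(size, 1))                          # would drive the hexagon scale
    time.sleep(0.1)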
-james
(a nomad. )
Posted: Mon Dec 27, 2010 1:23 am
by The Midi Thief
I've got the Kinect now too. I'm trying to figure out how to bring it into Modul8. I'm not going to go through a second computer.
I haven't seen a way to get any of the views as a normal video input. There should be a way to get it into Modul8 through the QC Rehab Syphon trick. I might try that. At the moment I just want to send a high-contrast b/w depth map into M8.
Posted: Mon Dec 27, 2010 3:18 am
by The Midi Thief
To answer myself:
It was pretty fast and easy to get this to work.
1. Follow the instructions in this thread on how to get Syphon input in Modul8:
http://www.garagecube.com/forum/viewtop ... 2&start=15
2. Then I downloaded the latest version of Kinect Tools from Kineme ( http://kineme.net/release/KinectTools/03 ) and the Syphon plugin for Quartz Composer. After the installations I restarted and looked at two example files. You can just pipe the Depth Image from the Kineme Kinect patch straight to the image input on the Syphon patch and voila!
3. To get the Syphon input in M8 you just need to drag the "Syphon Client QC.qtz" file (comes with the QC Syphon install) into the media library.
First I also tried getting the Kinect input via Syphon through an openFrameworks app (after reading this article: http://palace-of-memory.net/kinect-open ... mspjitter/ ). That worked nicely, but there are a bunch of steps to follow in order to get the app compiled in Xcode. But I do like the idea of having a small app running this in the background instead of having Quartz Composer running.
I don't think it's possible to have one QC file that solves it all. I tried importing the original Kinect QC file straight into Modul8 but it wouldn't connect to the Kinect at all.
Posted: Fri Jan 21, 2011 4:23 am
by The Midi Thief
Some more progress on the Kinect front:
You can get the tracking values from TuioKinect and have OSCulator translate the TUIO data into OSC or MIDI. It's quite sketchy at the moment because I haven't figured out the best foreground/background settings for the depth map in TuioKinect. Let's say it ain't the Minority Report yet.
Some valuable info here:
http://www.osculator.net/forum/threads/ ... ght=kinect
The tricky part was setting it up right in OSCulator. I had to learn more about OSCulator than I really thought I had to. I might do a tutorial on this but I need to be able to make something pretty first or it will feel kind of useless.
I wish there was a piece of software that made the Kinect camera outputs available either as regular camera inputs or via Syphon, and translated the tracking data to TUIO or OSC, all in the same app. It seems like only one app can be connected to the Kinect at a time.
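If you just want to peek at what TuioKinect is sending without OSCulator, here is a rough Python listener sketch (python-osc assumed; TUIO's default port is 3333 and, as far as I can tell, the hands arrive as /tuio/2Dcur "set" messages):

# sketch: listen to TuioKinect's TUIO stream and print cursor positions
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_2dcur(address, *args):
    if args and args[0] == "set":                  # 'set' carries session id, x, y, ...
        _, session_id, x, y = args[:4]
        print(f"cursor {session_id}: x={x:.2f} y={y:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dcur", on_2dcur)
BlockingOSCUDPServer(("127.0.0.1", 3333), dispatcher).serve_forever()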
Posted: Thu Feb 24, 2011 8:35 pm
by birdiedaboy
how did you get macam to recognize the kinect? i am trying to use the kinect as a webcam for skype, can anyone help!?
Posted: Fri Feb 25, 2011 5:24 am
by anomad
. @birdiedaboy -
. to the best of my knowledge, macam does not support the kinect at this time.
-james
(a nomad. )
Posted: Fri Feb 25, 2011 4:56 pm
by VjKg
I can get the syphon demo to work. I can get the kinect running and sending data via osc and other protocols. I cannot get kinect to qc to syphon to modul8. Can anyone help with a simple qc project that feeds the image from the depth camera into modul8?
Posted: Fri Feb 25, 2011 5:05 pm
by The Midi Thief
First of all, did you install QC Rehab as specified in this thread?:
http://www.garagecube.com/forum/viewtop ... 2&start=15
Posted: Fri Feb 25, 2011 6:52 pm
by VjKg
I got syphon working great with modul8. I just have no idea how to use qc and cannot route the kinect. I can see any qc template I load, but I cannot route the kinect to modul8.