OK, so I have to say how much I both love and hate Quartz Composer. I love the power and the options; I hate the way it never quite works for me. I used it for the demo above, with Synapse for skeleton tracking, and this method greatly reduced my latency issues. My problem is that Modul8 just shows my .qtz as a white image. I spent two hours on this one issue, yet with help from other tutorials I got the tracking itself working in under 30 minutes.
Is this one of those unsafe-patch issues? Did I forget to turn on some obscure thing at the end of the chain? Could I instead send the OSC data from the Kinect -> Synapse -> Quartz Composer chain directly to Modul8 to track a layer?
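For what it's worth, one way to test the Synapse -> Modul8 leg without Quartz Composer in the middle is a tiny OSC sender. The sketch below is a stdlib-only guess at that idea: the `/layer/position_x` address, port 8000, and the ±600 mm reach range are all placeholders I made up, not real Modul8 or Synapse values, so they would need checking against each app's OSC documentation.

```python
# Hypothetical sketch (stdlib only): encode a raw OSC message and send a
# normalized joint position to Modul8 over UDP. The address, port, and
# coordinate range below are placeholders, not confirmed Modul8 values.
import socket
import struct

def pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message carrying float32 arguments."""
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", float(a))  # OSC floats are big-endian
    return msg

def normalize(value: float, lo: float, hi: float) -> float:
    """Map a joint coordinate from [lo, hi] to [0, 1], clamped."""
    t = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, t))

def send_position(x_mm: float, host: str = "127.0.0.1", port: int = 8000) -> None:
    # Synapse world coordinates are roughly millimetres; +/-600 is only a
    # guess at a comfortable reach range and would need tuning per performer.
    payload = osc_message("/layer/position_x", normalize(x_mm, -600.0, 600.0))
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload, (host, port))
    sock.close()
```

Feeding Synapse's joint messages (which also arrive as OSC over UDP) into something like this would at least show whether Modul8 reacts to the data at all, independent of the white-image problem in the .qtz.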
With my original idea, even with the latency, setting the min and max range made everything in the zone I picked light up. With the new one, a person must stay in frame at all times, and if they leave for too long they must recalibrate (see image below). That might not be the best thing for a performer on stage.
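The original min/max-range idea is, in effect, a simple threshold over the depth image: anything between a near and a far distance lights up, with no skeleton and no calibration step. A minimal sketch of that, assuming a depth frame in millimetres as a NumPy array (the threshold values are illustrative):

```python
# Sketch of the depth-zone idea: pixels between near and far (in mm)
# become white, everything else black -- no skeleton, no calibration.
import numpy as np

def zone_mask(depth_mm: np.ndarray, near: float, far: float) -> np.ndarray:
    """Return a white/black uint8 mask of pixels inside [near, far]."""
    inside = (depth_mm >= near) & (depth_mm <= far)
    return inside.astype(np.uint8) * 255
```

Because this only ever looks at the current frame, a performer can walk off stage and back without any recalibration, which is the trade-off against the skeleton-tracking approach.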
I am working this dilemma from all ends and testing what works for me, but does anybody see something I am missing in my setup?