kinect augmented reality

So I planned on just drawing something on the screen with my Kinect, and it turns out this augmentation is way more interesting than any IR-tag method I've tried before. There is a very specific distance from the Kinect where the tracking is just phenomenal; I'm dubbing it the sweet spot. The accuracy here is a bit offset because I'm using my iSight for the augmentation: it sits slightly below the Kinect, so the perspectives aren't completely lined up, but for this initial, unintended experiment it works. Going to tighten it up a bit.
Wow! I can't wait to start testing this Kinect!
It seems the Kinect has to be within about 1.8 meters of the protagonist. Did you test it any further away? That would be a requirement for using it in live performance shows.
I'm wondering whether I should start replacing my whole IR tracking system! :-/
Hiya dust --
That didn't take long :-)
I'm working on interactive signage, and have built a few interesting things from your HaarKit facial recognizer - we discussed it here: http://kineme.net/forum/Applications/Compositions/Headtrackingfun
The Kinect really opens up the possibilities. It's obvious some brilliant things will be built in this forum as the tools mature.
Maybe you can share what you are planning, or what you'd like to see being created on this platform?
All the best -R
This Xbox thing is sick. I'm uploading a new one to Vimeo right now. The Kinect is blowing my mind's eye; check this screenshot. I was testing multi-touching the teapot, making it pour, touching it with all my fingers, and all of a sudden the augmented reality kicked in when I actually grabbed this thing that wasn't really even there. It's messing with my head right now. I've got to go to sleep; will experiment more later. My eye, the computer's eye, one of them is playing tricks on me.
double post ;P)
Amazing stuff! I think I need to buy one of these things....
a|x
I'm pretty happy I got one.
This is freaking me out! Are you using the depth output from the Kineme Kinect patch to do this? I have to admit it would take me years to do what you are doing on my own, so would you care to shed some light on how you are getting the XYZs?
B
I used openFrameworks and OpenCV to track blobs at different depths, then sent the blob data to QC via OSC for rendering.
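For anyone curious what that pipeline looks like in miniature: here is a pure-Python stand-in (the real thing uses openFrameworks/OpenCV, not this code). It slices a depth image into a band, finds connected blobs in that band with a flood fill, and packs each blob centroid into an OSC message. The depth values, band edges, and the `/blob` address are made-up examples, not dust's actual settings.

```python
import struct

def blobs_in_band(depth, near, far):
    """Find connected components (4-neighbour flood fill) of pixels whose
    depth value falls inside [near, far). Returns one (x, y) centroid per blob."""
    h, w = len(depth), len(depth[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if seen[y][x] or not (near <= depth[y][x] < far):
                continue
            # new blob: flood-fill every in-band neighbour
            stack, pixels = [(y, x)], []
            seen[y][x] = True
            while stack:
                cy, cx = stack.pop()
                pixels.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and near <= depth[ny][nx] < far):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            centroids.append((sum(p[1] for p in pixels) / len(pixels),
                              sum(p[0] for p in pixels) / len(pixels)))
    return centroids

def osc_message(address, floats):
    """Minimal OSC 1.0 encoder: null-padded address, type-tag string,
    big-endian 32-bit floats. Enough to talk to QC's OSC receiver."""
    def pad(b):  # null-terminate and pad to a multiple of 4 bytes
        return b + b"\0" * (4 - len(b) % 4)
    msg = pad(address.encode()) + pad(b"," + b"f" * len(floats))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# toy 3x4 "depth image" in millimetres: one blob near, one blob far
depth = [
    [900, 900,    0,    0],
    [900, 900,    0,    0],
    [  0,   0, 1500, 1500],
]
for cx, cy in blobs_in_band(depth, 800, 1200):
    packet = osc_message("/blob", [cx, cy])  # would go over UDP to QC
```

In the real setup each depth band would map to a gesture layer (touching vs. hovering), and the OSC packets would be sent over UDP to a Quartz Composer OSC receiver patch.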
Has anyone done blob tracking natively in QC -- maybe for a multitouch project?
Blob analysis is in the OpenCV libraries. I've asked around, and the Processing guys point to openFrameworks for the most mature implementation.
Crazy!
My first attempt; tracking goes so nicely with the Kinect!