osceleton + rgb or depth

Hey, is it possible to put the rgb or depth image of the Kinect under the OSCeleton skeleton, so it looks like the OpenNI sample? I saw "itsthejayj"'s stuff but I have no clue how to manage that. And would it then be possible to manipulate a cylinder a bit like the image attached? I think I saw OSCeleton manipulating metaballs?! _ 10.6.6
In order to view the rgb/depth/infrared image from the Kinect and also get the OpenNI User Tracker / skeletal tracking and derivative stuff (like OSCeleton) at the same time, there would have to be some sort of plugin that receives the image data through one of the drivers built to work with OpenNI, like Avin's fork of the Kinect driver, which integrates with OpenNI.
Right now, the only plugins that integrate the Kinect into QC are Kineme's Kinect Tools, Vade's beta Open Kinect, a project on GitHub by someone that isn't really set up correctly (I can't remember the name of it), and another project that attempts to use Cinder's Kinect libraries in the context of a QCPlugin, which also doesn't work yet. All of those use libfreenect... which isn't really bad, but it does preclude them from working well with the OpenNI stuff.
When one of the drivers is using the Kinect (at least on my system), it seems to prevent the other driver from accessing it as well, at least as things are written now.
In the context of OpenNI, you can have multiple apps going at the same time. So you can have the image stuff happening, the skeletal tracker, and the hand tracker, and there is no conflict with it all working simultaneously, apart from the increased toll on your computer from running the multiple processes.
So, I think the answer is no right now... unless you wanted to open up the OpenNI viewer and OSCeleton, then use v002 Screen Capture to grab the OpenNI window and composite it within QC, so that everything was sort of happening in OpenNI land. That would be kind of crappy though, I think. Running two machines and using remote desktop would probably also suck.
It would probably be most productive to look at making a provider plugin that uses the SensorKinect driver, so that there wouldn't be this conflict.
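On the cylinder question: the OSCeleton side is just OSC messages, so wiring a joint to a parameter is the easy part. Here's a minimal Python sketch of the idea; the `/joint` address, the `r_hand` joint name, and the 0..1 normalized coordinates are assumptions based on OSCeleton's defaults, so adjust to whatever your build actually sends (and in a real patch the callback would be fed by an OSC library or QC's OSC receiver rather than called by hand):

```python
# Sketch: cache OSCeleton-style /joint messages, then map one joint's
# position onto a cylinder rotation. Message format is an assumption
# from OSCeleton's defaults -- verify against your own build's output.

skeleton = {}  # (user_id, joint name) -> (x, y, z), latest value wins

def handle_joint(address, joint, user_id, x, y, z):
    """Store the latest position for one joint of one user."""
    if address == "/joint":
        skeleton[(user_id, joint)] = (x, y, z)

def cylinder_angle(user_id):
    """Map the right hand's x (assumed 0..1) to a -90..+90 degree rotation."""
    x, _, _ = skeleton.get((user_id, "r_hand"), (0.5, 0.5, 0.5))
    return (x - 0.5) * 180.0

# Feed in a fake message the way an OSC server callback would:
handle_joint("/joint", "r_hand", 1, 0.75, 0.4, 0.5)
print(cylinder_angle(1))  # hand at x=0.75 -> 45 degree rotation
```

The same mapping drives metaballs or anything else: cache the joints as they arrive, and read whichever ones you need per frame.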
Just use another camera and put it as close to the Kinect rgb cam as possible. Like, if you have a MacBook, prop up the Kinect behind the lid so it is right on top of the iSight, then nudge the iSight video to line up with your skeleton, then crop so everything looks clean.
?
That gets you an rgb image, but what about the depth channel?
When you use something like OSCeleton (e.g., anything that uses OpenNI), doesn't it prevent something like the libfreenect driver from accessing the depth image (and/or rgb/ir)? It does for me (I may have just wound up unlucky or something!)
Thanks for the detailed answer, gtoledo3! I thought about the 2nd camera, but it would be cooler to get a 3D reality image with the OSCeleton skeleton, and also to adjust the whole thing while the Kinect is already recording an image... I thought there would be an easier option.
But I'm gonna fake it for now_
I was under the impression that he wanted to line the skeleton up like the itsthejayj tryplex video, sort of augmented reality where it was a video of him hitting virtual objects. Yeah, I don't think it's possible to use libfreenect and NITE at the same time. It is possible to get the depth image and/or rgb image from NITE while skeleton tracking, though. I am doing this in Unity.
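One small footnote if anyone does go the libfreenect route instead: the raw 11-bit depth values aren't millimeters, so you need a conversion if you want real distances. Here's a sketch using the community-derived tangent fit (often attributed to Stéphane Magnenat on the OpenKinect list); it's an approximation, not anything official, and irrelevant on the OpenNI side since OpenNI/NITE already hands you depth in millimeters:

```python
import math

def raw_to_meters(raw):
    """Approximate distance in meters for an 11-bit Kinect depth reading.

    Community-derived tangent fit from the OpenKinect list -- an
    approximation only; calibrate against your own unit if accuracy matters.
    """
    if raw >= 2047:  # 2047 marks "no reading" in the raw stream
        return float("inf")
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)

# Smaller raw values come back as shorter distances:
print(raw_to_meters(600), raw_to_meters(1000))
```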