Animations in fbx format supported by Kineme3D?

photonal's picture

I notice that Kineme3D supports animations in the MD2 format.

I haven't come across any programmes which can generate these files for OS X. However, various 3D modelling programmes can create animations and save them in the FBX format, but Kineme3D doesn't seem to support them.

Can someone confirm this? Will Kineme3D support other 3D animation formats in the future perhaps?

cwright's picture
Re: Animations in fbx format supported by Kineme3D?

It supports animation in FBX IF the animation is a bunch of separate objects (just like MD2). It does not support keyframing or skeletal animation from FBX yet (we're planning on supporting that in the future; it's just a lot of work to implement :/)

sbn..'s picture
Re: Animations in fbx format supported by Kineme3D?

I'm not a K3D user, but it puzzled me a bit when I read that the animation format it supports is MD2.

It's not a particularly nice or powerful format, and a bit quirky on top of that. It's really specced for optimizations and ways of doing animation that suit an older game engine. Still, I guess the Kineme team had good reason to choose it.

Have you seen this page? Blender should be able to export MD2.

cwright's picture
Re: Animations in fbx format supported by Kineme3D?

You're completely right, it's not a particularly good or powerful format (it's very dated, too). However, writing a parser for MD2 files was extremely easy, and having some basic animation capabilities was better than nothing.

FBX's animation stuff is much more sophisticated, but with that sophistication comes much more complex code for parsing and animating, and a much more complex interface. It's on the roadmap, but we don't have any estimates on it.

photonal's picture
Re: Animations in fbx format supported by Kineme3D?

Thanks for the feedback. If you guys get it done, it would be really great!

sbn..'s picture
Re: Animations in fbx format supported by Kineme3D?

I hope you understand I didn't intend to rip on your efforts, cwright.

After reading about it, I see that MD2 is sort of the last GE format before real skeletal animation became a reality. Skeletal with envelope blending is, I assume, vastly more difficult.

What would be your commission to integrate the Blender game engine as a node in QC? I'm joking, of course, but could it be done?

cwright's picture
Re: Animations in fbx format supported by Kineme3D?

No offense taken at all :)

After MD2, there was MD3, which was still keyframed like MD2, but with variable framerates, more detailed meshes, and segmented animations (so you could control walking legs independently of shooting arms or looking heads, etc.). It's essentially a variation on the MD2 theme, but much more complete (and much more complex, as the renderer needs to stitch together several animating mesh segments -- working that into Kineme3D's pipeline is difficult, to say the least :)

After that, MD4 included skeletal animation, and it's been that way ever since (I'm not sure what .X and other file formats offered, but I'm guessing it was along a similar timeline).

I have no idea what the commission would be -- Blender is very complicated, and 90% of the hard part about making stuff work in QC is giving it a simple-to-use patch for people to actually take advantage of. Don't get me wrong, the 10% to port it is probably a heroic effort by itself, but making it usable is that much harder (we're facing that problem with skeletal animation -- how exactly should that get exposed in the QC editor?).

gtoledo3's picture
Re: Animations in fbx format supported by Kineme3D?

When you consider the way an MD2 file is organized when it has frame info, animation is essentially a freebie in QC, in that it is fairly easy to handle frame sequencing using built-in patches by manipulating the structure index value. There are static MD2 files; an MD2 doesn't necessarily have animation info at all. So, by adding one file type, you get at least really basic animation, in an easy-to-handle way.
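To illustrate the idea outside QC (a hypothetical sketch -- the frame rate and frame count here are made up, and this isn't Kineme3D's actual API), the sequencing really is just index arithmetic on the frame list:

```objc
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Assumed values -- in QC these would come from Patch Time and the
       structure loaded from the MD2 file. */
    double patchTime    = 3.7;   /* seconds */
    double fps          = 10.0;  /* assumed playback rate */
    unsigned frameCount = 40;    /* assumed keyframes in the model */

    double t = patchTime * fps;
    unsigned frameA = (unsigned)t % frameCount;   /* current keyframe index */
    unsigned frameB = (frameA + 1) % frameCount;  /* next keyframe, wrapped */
    double blend = t - floor(t);                  /* 0..1 blend between them */

    printf("frame %u -> %u, blend %.2f\n", frameA, frameB, blend);
    return 0;
}
```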

It is a pretty old file format, but in that, there's the benefit of being able to handle it solidly, and also the benefit of there being a pretty good amount of MD2s floating around to play with. They're also pretty low-poly, small files.

When you start comparing QC to a gaming engine, it's going to fail -- at least out of the box, and really, right now, period. When I look at some of the dynamic lighting, anti-aliasing, etc., going on in modern gaming systems, and then I start loading .dae files with some shadows and just try to rotate them with an interpolation patch... it's not pretty, performance-wise!

Chris is right on about simplicity of use being key, as well as smooth integration with the rest of QC. If you just have a totally independent system kludged on, one that doesn't work in important instances and isn't obvious how to use, it evokes a great deal of frustration for the end user. Function, ease of use, and stability are ever the delicate balancing act!

One can have a loader that loads a Blender gaming scene, or some other format, but that doesn't mean it's really usable, or that one can interact with or change what's going on in that scene. That's the problem with just grafting on a big chunk of something like that -- it has to have been designed in a way that allows whatever someone needs to work with in QC to be exposed in the working environment. For example, I think the OpenEmu stuff is very fun, but besides playing a video game, or filtering the output some way, or rendering the visual onto something... there isn't any way to actually interact with the game file.

In something like that, the thoroughness of integration into the QC environment goes hand in hand with a reason to actually use that tech in QC. What would be the point of running something you can run in Blender, in QC, if not to have every patch be able to solidly interact? There wouldn't be any benefit except the novelty of it.

One gigantic issue in QC right now -- slightly apart from character animation, but a big part of it being usable for setting up a 3D scene with physics -- is that nothing "knows" where anything else is in any kind of automatic way (for the user).

I think it would be pretty cool to see something like a Blender engine in QC, but I would think a unique engine would probably be way more useful in the long run. There are solid paradigms within QC in place... it just requires careful consideration in making them work in a way that is tailored for 3D stuff.

sbn..'s picture
Re: Animations in fbx format supported by Kineme3D?

Good points, but:

While I can only speak for myself, I'd definitely want a patch in the vein of "Image with .blend file", maybe only with the path as an input. Tier two would be for the patch to be able to listen for inputs "published" via Python in the GE, or some sort of communication protocol.

Maybe it sounds weird or like too much hassle, but I do live visuals and sometimes use BGE scenes specifically created for that purpose. I control them with a joystick (could also be MIDI or OSC), headlessly. This works to some degree.

The meat of the shows I do is through a custom QC system. As it is, there is no way of combining QC and BGE, except with an extra computer and an external video mixer, which is a hassle and puts everything into standard-definition PAL (as well as having bloody awful keying). To me, putting the BGE inside QC would be to leverage what each system is best at. Nowadays the BGE is quite a powerful, optimized realtime system with advanced shaders, skeletal and morph target animation, even rigid & soft body physics! The latest big news to me was the inclusion of direct access to mesh data from Python scripting -- the ability to create parametric meshes directly from input or sound analysis / OpenCV data, etc.

It's something I've thought about for a while, and I think -- for me -- that it makes most sense to have it arranged that way around (BGE in QC) if you're going to try to mix the two at all. The BGE handles video, but not as well or as optimised as QC, and its output is in a buffer already in video RAM. For the communication, I could just do what I do now, which is listen to hardware or OSC from within Blender.

Well, I should maybe have started a new topic instead of rambling in this one, but I'm going to submit this since the discussion is already rolling.

Before I go off to look at the v002 screen capture source -

(incidentally, are video RAM buffers somehow locked to one application?)

  • I will say this: Ultimately I think your stance on the question "Is the Blender game engine a good fit in a QC patch?" depends on your view of what QC is. I find myself creating meta-systems that build upon QC but are dependent on other stuff (my video mixing system already has an informal protocol of sorts on top of what Olivier's made, as well as a largeish Cocoa component). So, in my view, the more building blocks the better, even if they're somewhat non-standard. More hooks, please!

End rant, and cheers for your work, and thoughts!

gtoledo3's picture
Re: Animations in fbx format supported by Kineme3D?

Mmm, agreed. There's a good case for even just being able to pump the visual output of a window from another app into QC in a transparent way, loading it into the QC system, even if you can't manipulate it that much in QC, as long as it's in the context of a system where you can manipulate it somehow.

It's just a matter of perspective when it comes to approaches like that. I mean, I would rather see native QC solutions as robust as something like Blender but "QC-esque", and then the ability to pull in stuff like external file types on top of the "in QC" technique. So, you could create objects and animate in QC, or pull in stuff through loaders and use its own animation types, but handle it in a way that is kind of consistent even if it is external data.

As crazy as this is going to sound, a real "one size fits all" scenario for some of the A/V system stuff you're talking about would be solved by having the ability to launch an app window straight into the QC Viewer. It certainly gives a level of integration that is consistent, no matter what the app or scenario. Oh, what a hack that would be! It would be extremely useful though, to have a render patch that, in its settings, lets you pick an app that would get launched by enabling the patch, and then be able to interact with it in the QC Viewer window the same way that you would in the normal OS/Desktop environment.

sbn..'s picture
Hack to end all hacks.

Ooh, now you're talking! That would be a sweet hack indeed.

Or, alternatively, just "Image with VNC session" or "Image with Virtualbox instance" ;)

I might as well note this here, since I went ahead and asked the good people of ##opengl on freenode. Here's their opinion: trawling through video memory for contexts / buffers isn't feasible, but in 10.6 there's a new framework called IOSurface. It's supposedly for just this kind of thing, but documentation is lacking as far as I can see. No one there knew if it's possible to just pass a pointer to any OGL context between applications.

The thing is, of course, that BGE's output is already well suited to what QC does best: it's in a buffer in VRAM. Being open source, anyone could patch it to pass out a reference. Sadly, it's probably over my head.

vade's picture
Re: Hack to end all hacks.

QC 4.0 uses IOSurface in its QuickTime import plugin, as does the QuickTime X player. Basically, IOSurface lets you publish a texture via a mach_port to another app, or register it globally for all apps to see.

The docs are there, in the headers in IOKit, Core Video, Core Image and CGL. IOSurfaces are implemented in Core Image (you can make a surface into a CIImage and vice versa), in Core Video (you can make pools of surfaces), and in CGL, the low-level OpenGL library, where you can make a texture or FBO attachment into a published surface.
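Roughly, the global-registration route looks like this (a sketch only -- the BGRA format, dimensions, and helper names are my assumptions, and all error handling is omitted):

```objc
#import <Foundation/Foundation.h>
#import <IOSurface/IOSurface.h>
#import <OpenGL/OpenGL.h>
#import <OpenGL/CGLIOSurface.h>
#import <OpenGL/gl.h>

// Publisher side: create a globally visible IOSurface and hand out its ID
// (via a mach port, OSC, or whatever announcement protocol you settle on).
static IOSurfaceRef PublishSurface(size_t width, size_t height)
{
    NSDictionary *props = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithUnsignedLong:width],  (id)kIOSurfaceWidth,
        [NSNumber numberWithUnsignedLong:height], (id)kIOSurfaceHeight,
        [NSNumber numberWithUnsignedInt:4],       (id)kIOSurfaceBytesPerElement,
        [NSNumber numberWithBool:YES],            (id)kIOSurfaceIsGlobal,
        nil];
    IOSurfaceRef surface = IOSurfaceCreate((CFDictionaryRef)props);
    NSLog(@"surface ID to announce: %u", IOSurfaceGetID(surface));
    return surface;
}

// Consumer side: look the surface up by ID and bind it as a rectangle texture.
static GLuint TextureFromSurfaceID(CGLContextObj cgl_ctx, IOSurfaceID surfaceID)
{
    IOSurfaceRef surface = IOSurfaceLookup(surfaceID);
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    CGLTexImageIOSurface2D(cgl_ctx, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA,
                           (GLsizei)IOSurfaceGetWidth(surface),
                           (GLsizei)IOSurfaceGetHeight(surface),
                           GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                           surface, 0);
    return tex;
}
```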

What's not clear is what the drawbacks and gotchas are, because trust me, there will be many. I don't think this is straight-up texture sharing between apps on the GPU (i.e., uploaded textures in VRAM), but something like "client storage" sharing, meaning textures that are pointers in main memory (Core Video does this, and it's a way to keep things nice and fast on OS X for textures that change every frame).

I speculate this because you can make a CVPixelBuffer into an IOSurface, but not a CVOpenGLTextureRef. The difference is that one lives in RAM, the other in VRAM on the GPU.

Of course, that's speculation.

Either way, this could be hugely powerful for inter-VJ-app texture sharing. You could publish the output of Modul8 and bring it into VDMX as a source. Imagine that. I need to read up on Mach ports, but I'm not enough of a low-level programmer to really get the gotchas. I'd love to work on a framework for inter-app texture sharing and a protocol to announce new surfaces and find existing shared surfaces.

Could be fucking amazing

vade's picture
Re: Animations in fbx format supported by Kineme3D?

George: I'm trying to get v002 screen capture to work within the QC editing environment (it works in apps that use QC, but some issues prevent it from initializing a context properly in the QC editor).

That would get you pretty close to what you want, it sounds like?

gtoledo3's picture
Re: Animations in fbx format supported by Kineme3D?

To reply to both posts in one fell swoop, and clarify: yes, I'm talking about something way more radical. It wouldn't be so much like setting up a window and getting a grab of the texture; it would be more direct, in that the app would likely have to be launched via a plugin that wraps it (launching the actual app in QC Viewer space, launching a temp copy of the app from an app-support-type folder, or embedding it in the qtz file -- different routes are possible on that one) and run in the QC Viewer window, using that window much like the normal OS X environment, with the ability to have key and mouse events, internet connectivity, etc., but rendering to QC surfaces. I don't think this is covered by the standard API! Something that similar in function to the actual OS would likely be best done by Apple as well... it would be a pretty major thing for an outside party to do, with possible unintentional malevolent consequences.

What you are talking about, with simply getting the texture, is really useful... I mean, that's certainly a real problem solver for many A/V scenarios (if not a total problem solver, a definite mitigator), especially if you could do it in QC. Does the screen grab still work in SL? I haven't tried it since it came out.

sbn..'s picture
Re: Hack to end all hacks.

Ah, I only looked at the IOSurface headers themselves, and couldn't find enough context to make sense of it. I need to understand the search in Xcode's doc window better; I'd expect it to pick up on that word in the other frameworks' docs.

Still, fucking amazing is the word!

And, it could work much like your screen cap. That is, no pleading with every VJ app to include it; just publish a plugin, two .qtz's for input and output, and (maybe) an app that acts as a "server", so you have to actively enable / run something to enable hijacking (think JACK server). That approach would ensure it doesn't take years to get off the ground like FreeFrame.

If it's possible...

Incidentally, I've never gotten the screen cap to work in my own VJ app, which runs inside a QCView. That's another discussion, though...

sbn..'s picture
Re: Animations in fbx format supported by Kineme3D?

I still like your idea. You'd probably have to find a way to pose as and implement a window server, though. Or register as a fake (second / third) display, and find out where that buffer would be stored in VRAM. (If I understand the v002 screencap correctly, it just grabs the main front buffer -- there should be other standard buffers for other screens, left / right or some such, IIRC.)

But, as you say, I'd also settle for "just" the buffer hijacking technique. And a pony ;)

vade's picture
Re: Hack to end all hacks.

Don't use a QCView; use an NSOpenGLView or a custom context, and make sure you use the NSOpenGLPFAFullScreen pixel format option in the pixel format for the context you init the QCRenderer on. Look at the QCPlayerPlus example code: http://sourceforge.net/projects/qcadvancedplaye/
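For anyone landing here later, a minimal sketch of that setup (the .qtz path is a placeholder, and QCPlayerPlus does this more completely -- it's the better reference):

```objc
#import <Cocoa/Cocoa.h>
#import <Quartz/Quartz.h>

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Full-screen pixel format, per vade's advice.
    NSOpenGLPixelFormatAttribute attrs[] = {
        NSOpenGLPFAFullScreen,
        NSOpenGLPFAScreenMask,
        CGDisplayIDToOpenGLDisplayMask(kCGDirectMainDisplay),
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFAAccelerated,
        0
    };
    NSOpenGLPixelFormat *format =
        [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];
    NSOpenGLContext *context =
        [[NSOpenGLContext alloc] initWithFormat:format shareContext:nil];

    // Full-screen contexts need the display captured before going full screen.
    CGDisplayCapture(kCGDirectMainDisplay);
    [context setFullScreen];
    [context makeCurrentContext];

    // Init the QCRenderer on that context and render one frame.
    QCRenderer *renderer =
        [[QCRenderer alloc] initWithOpenGLContext:context
                                      pixelFormat:format
                                             file:@"/path/to/composition.qtz"];
    [renderer renderAtTime:0.0 arguments:nil];
    [context flushBuffer];

    CGDisplayRelease(kCGDirectMainDisplay);
    [pool drain];
    return 0;
}
```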

dust's picture
Re: Hack to end all hacks?

Although it doesn't work in the context of QC, it does work in QC Builder. Imagine all the fun you could have with this. QB "shield" doesn't seem to work with v002 screen capture, but that might just be an SL thing -- not sure. Here is an example of some fun you could have. Don't worry, the app quits in 10 seconds, but with screen capture you could have a captured audience for those 10 seconds. If you wanted to take this farther, you could do an AppleScript like "tell app x window front most" and really mess with a kiosk if you wanted.

Attachment: xSafari.zip (421.26 KB)

franz's picture
Re: Hack to end all hacks.

Hey vade, sorry, but there doesn't seem to be any file to download at SourceForge? QCPlayerPlus

.lov.'s picture
Re: Hack to end all hacks.

You need to check out the latest revision: http://sourceforge.net/projects/qcadvancedplaye/develop

gtoledo3's picture
Re: Hack to end all hacks.

It has ended up being so far!