I don't even know what to call this...

I think that some of this is covered by particle tools, and I know I can get some similar effects going with Kineme3D, GLSL shading and logic ops.

I also know that you can put particles in things like toneburst's port of the desaxismundi shapes, and they "take on" the correct shape.

So I almost wonder whether things like this "explosion" are already kind of possible and I just don't know how to achieve them, or whether it's just overly difficult?

This fluid simulation.... is fairly amazing, but I'm not sure how much of it is really computer generated either.

I'm actually going to mark this as "started active".... because I may be crazy, but I think we aren't that far off from achieving this kind of stuff already, and just comparing notes might get us somewhere...

I promise I will dig out some of my GLSL/Logic Op deformation/fake explosion stuff (scratchpole, I KNOW I still have failed to dig out the skeleton deformation... but I PROMISE I'll find it, I just named it something stupid or saved it in a weird place), as well as some "desaxismundi particles", and post them on here! If anyone has been working on similar concepts, out with them! :o)

dust's picture
realFlow

looks like some of this is done with real flow. real flow takes a fluid dynamics simulation and renders the simulation out as polygon meshes that you then use in 3d software. the hard part is rendering the textures so they look like fluids. you start to see the meshes i'm talking about when the number 2 comes in. basically you take a container, in this case a number that is a 3d object, and you do particle simulations, like filling the container with particles. depending on your settings, like viscosity or whatever, the particle simulation may take some time to actually calculate. i started to just fill up a cup with particles and got it about half full (or half empty, depending on your perspective on life) before i decided to render the particles to a mesh; that took maybe an hour or so.

not sure how you would go about doing this sort of simulation in real time in quartz, but please keep us informed. i'm pretty sure all of what is in this video is computer generated. like i said, real flow is a program made to do exactly this sort of thing.

gtoledo3's picture
Well, I saw some stuff that

Well, I saw some stuff that Steve Mokris posted yesterday on Vimeo about a Particle Tools update...

And I am pretty sure that if I can feed in "round balls" and/or "curved planes", or other 3D objects... then along with Kineme3D it won't be that hard to get stuff that looks like this. Of course, it would be awesome if there were a patch more straightforward than the somewhat "convoluted" setup I know I would have to build to achieve this whenever the 3D render option is added to particle tools (which would basically involve spherical FBXs, stretched/manipulated with the 3D tools, with vertex shading... and a LOT of manual tweaking).

I totally understand what you are getting at about the "container"... that is what can be achieved right now using very specific GLSL shading... but it's mostly limited to REALLY basic shapes... the desaxismundi HLSL-to-tb-GLSL stuff has some more developed shapes, which is what got me thinking along these lines.

Right now, particle tools generates lines/sprites/rectangles (I'm not sure I'm using the right term here)... but no "roundness". From what I have already been doing (not posted anywhere yet), I'm pretty darn sure I basically need a "round" input to get the liquid stuff going...

The stuff where the "1" gets blown into "rocks" and crumbles... is really cool too. I'm digging that Steve's posts suggest there is 3D input/render in the upcoming particle tools, but I wonder whether wind or explosion forces could act upon the vertices/polygons in the object... It would be great to set a "wind force" at x=-1, y=+1, z=-1.... amp up the strength of the wind... and have the polygons in that area start to "flutter" or even blow away.
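
Just to sketch what I'm picturing (totally untested, and the windDir/windStrength/time inputs are just names I'm making up here, not anything that actually exists in particle tools), a vertex shader along these lines could at least push vertices around based on a wind vector:

```glsl
// Rough "wind" vertex shader sketch (GLSL 1.x, QC-era style).
// windDir, windStrength and time are hypothetical inputs for this example.
uniform vec3 windDir;        // e.g. (-1.0, 1.0, -1.0)
uniform float windStrength;  // 0.0 = calm; crank it up to blow harder
uniform float time;          // wire up a Patch Time for the flutter

void main()
{
    vec4 v = gl_Vertex;

    // simple per-vertex flutter: the phase varies with position so the
    // polygons don't all move in lockstep
    float flutter = sin(time * 4.0 + v.x * 3.0 + v.y * 5.0);

    v.xyz += normalize(windDir) * windStrength * (0.5 + 0.5 * flutter);

    gl_Position = gl_ModelViewProjectionMatrix * v;
    gl_FrontColor = gl_Color;
}
```

Obviously that only wobbles the mesh in place; actually tearing polygons off and blowing them away is a whole other animal.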

The stuff with the blue glassy textures can definitely already be achieved. There are also some things where the number does these little "offshoots".... which can also already basically be achieved (but it's a p.i.t.a.). You basically have to set up a "multi-object" scene that looks like "one object". Then you have to wrap a deforming shader around whatever part you want to "explode"... so you get the impression of most of the object staying the same and just a part of it deforming/exploding/wiggling/dimpling (depending on the shader code, of course). Even then it isn't exactly what is happening here, and it's far from intuitive.
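
For what it's worth, the "only part of it deforms" trick can also be faked in a single shader by masking the displacement by position, instead of splitting the scene into multiple objects. A rough, untested sketch (explodeCenter/explodeRadius/explodeAmount are invented inputs for the example):

```glsl
// Deform only the part of the mesh near a chosen point; everything else
// passes through untouched.
uniform vec3 explodeCenter;
uniform float explodeRadius;
uniform float explodeAmount;

void main()
{
    vec4 v = gl_Vertex;

    // 1.0 at the center of the "explosion", fading to 0.0 at the edge
    float mask = 1.0 - smoothstep(0.0, explodeRadius, distance(v.xyz, explodeCenter));

    // push the affected vertices outward along their normals
    v.xyz += gl_Normal * explodeAmount * mask;

    gl_Position = gl_ModelViewProjectionMatrix * v;
    gl_FrontColor = gl_Color;
}
```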

gtoledo3's picture
filling a jar?

I've done plenty of spherical and torus shaders with particles, and I wanted to post this because I believe you can basically write the shader to approximate the final shape desired for some of these effects... I would bet that if I were generating orbs or some kind of 3D objects inside of this using the Kineme-type particle forces, and then actually altering the shader "shape", it wouldn't be that hard to start getting liquid-like effects.
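
In case it helps anyone compare notes, the "shader approximates the final shape" idea is basically just remapping the incoming grid/particle positions in the vertex shader. Something like this untested torus sketch, where R and r are made-up inputs for the major/minor radii:

```glsl
// Wrap an incoming flat grid (x and y roughly in -1..1) onto a torus,
// the same general idea as the desaxismundi/toneburst solids.
uniform float R;  // major radius
uniform float r;  // minor radius

const float PI = 3.14159265;

void main()
{
    // treat the incoming x/y as the two torus angles
    float u = gl_Vertex.x * PI;  // around the ring
    float v = gl_Vertex.y * PI;  // around the tube

    vec3 p;
    p.x = (R + r * cos(v)) * cos(u);
    p.y = (R + r * cos(v)) * sin(u);
    p.z = r * sin(v);

    gl_Position = gl_ModelViewProjectionMatrix * vec4(p, 1.0);
    gl_FrontColor = gl_Color;
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
```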

This is a derivative of toneburst's fish shape from the desaxismundi solids example, but with particles instead of GLSL grids... in an effort to show how you can "fill the jar", so to speak. Actually, I probably should have amped up a few of the parameters to show how it can look like a totally "solid" object if there are enough particles. Oh well, take my word on it :o)

cwright's picture
dynamics

This is just a physics/dynamics engine, popular in most video games these days. You can take a mesh (a convex hull), break it into pieces, and then simulate their physical properties. Pretty straightforward stuff (in a textbook kind of way).

(That's the first part).

The fluid part is a bit trickier -- actual fluid simulation. I don't have many details for that (because good fluid sim is almost never realtime) -- you could do marching cubes and some hacks to get close to it though, I guess.

(that's the second part).
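
One cheap screen-space stand-in for the "hacks to get close to it" part (nowhere near real fluid sim, and the blob0/blob1/blob2/blobSize/threshold inputs are invented for the sketch): sum a few falloff fields in a fragment shader and threshold the result, so nearby blobs merge like droplets.

```glsl
// 2D "metaball" hack on a sprite/billboard (fragment shader).
// The companion vertex shader only needs:
//     gl_Position = ftransform();  gl_TexCoord[0] = gl_MultiTexCoord0;
uniform vec2 blob0;       // blob centres in 0..1 texture space
uniform vec2 blob1;
uniform vec2 blob2;
uniform float blobSize;   // radius-ish scale
uniform float threshold;  // iso level; lower = gloopier

float field(vec2 centre, vec2 p)
{
    float d2 = dot(p - centre, p - centre);
    return (blobSize * blobSize) / (d2 + 0.0001);
}

void main()
{
    vec2 p = gl_TexCoord[0].xy;
    float f = field(blob0, p) + field(blob1, p) + field(blob2, p);

    // thresholding gives the merging-droplet look; smoothstep softens the edge
    float body = smoothstep(threshold, threshold * 1.2, f);

    gl_FragColor = vec4(vec3(0.3, 0.6, 1.0), body);
}
```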

The 3rd part looks like it might be possible with particles, though it would require a lot of them, and it wouldn't be particularly fast. A better way would be some volumetric simulation stuff (I only know a little about this) that approximates clouds/smoke using fancy shaders.

gtoledo3's picture
So I guess in essence the

So I guess in essence the first part could probably be achieved now, except that it would be a major pain in the ass :o) It wouldn't be hard to prepare the appropriate fragments and just have them move to their places with 3D transforms, or by adjusting coordinates on the renderer.... just time-consuming!

But judging from some of the stuff Steve posted yesterday, the addition of vertex shading, plus some logic op stuff, would probably get pretty close to what I have in mind as far as weird wobbly, blobby, shard-type things go...

That new particle tools looks like it is going to be pretty awesome!

Mmmmm.... FANCY shaders. "Those are mighty fancy shaders you have there boy, whatcha doin round these parts with those....?"

I Wikipedia "fancy shaders" and get no results, WHY IS THIS CHRIS??????? :o)

cwright's picture
seen nothin' yet

The next particle tools will indeed be cool (esp with the kineme3d bindings you've seen... so kind of smokris to leak them ;). However, it's still built on the old engine we've been using (thankfully, by having Kineme3D do the rendering, we get to avoid its wonky GL context hackery).

Once we rewrite the engine on our own (and find the proper convergence between Kineme3D, Particle Tools, Physics, GLTools, and everything else), it will be pretty cool. That is, if we get to it before Snow Leopard changes the game (just as Leopard did for us last year). *shrugs* we'll see what happens.

fancy shaders (for volumetric stuff) = calculate the distance through "stuff" that the pixel goes through, and act accordingly (with proper deterministic 3d noise, and possible scattering). I've heard rumors of these kinds of things, but I've never actually seen them with my own eyes... (not sure if they're "shaders" in the GLSL sense, or in the RenderMan sense -- guessing the latter, for technical reasons).
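
To make the hand-waving slightly more concrete (this is only the GLSL-flavored reading of it, untested, and the noiseTex/rayOrigin/rayDir plumbing is assumed to be set up elsewhere): march a ray through a bounded volume, accumulate a noise-based density at each step, and shade from how much "stuff" the ray passed through.

```glsl
// Fragment-shader sketch of the "distance through stuff" idea.
// Assumes the vertex shader provides rayOrigin/rayDir varyings, and that
// noiseTex is a tileable noise image fed into the shader.
uniform sampler2D noiseTex;
uniform float densityScale;

varying vec3 rayOrigin;
varying vec3 rayDir;

void main()
{
    const int STEPS = 32;
    float stepLen = 2.0 / float(STEPS);   // march across a unit-ish volume

    float accum = 0.0;
    vec3 p = rayOrigin;

    for (int i = 0; i < STEPS; i++)
    {
        // fake 3D noise by combining two 2D lookups
        float n = texture2D(noiseTex, p.xy).r * texture2D(noiseTex, p.yz).r;
        accum += n * densityScale * stepLen;
        p += rayDir * stepLen;
    }

    // more accumulated density = more opaque "smoke"
    float alpha = 1.0 - exp(-accum);
    gl_FragColor = vec4(vec3(0.9), alpha);
}
```

Real scattering is a whole extra layer on top of that.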

gtoledo3's picture
oh yeahhhhhh

cwright wrote:
The next particle tools will indeed be cool (esp with the kineme3d bindings you've seen... so kind of smokris to leak them ;). However, it's still built on the old engine we've been using (thankfully, by having Kineme3D do the rendering, we get to avoid its wonky GL context hackery).

Ah! Understood... that makes sense that this would have been based on the old engine. As an aside, I do admit to wanting to throw in a GL Clear Depth on one of those fish comps so you wouldn't see it go through the line... lol, I wish I were so detail-oriented with my OWN comps. I'm assuming that would work, but I may not be understanding that patch correctly yet... the more I poke around all of GL Tools, the deeper the implications seem for a lot of compositions; you almost have to go back to square one and understand that plug-in inside and out to make the most of it.

cwright wrote:
Once we rewrite the engine on our own (and find the proper convergence between Kineme3D, Particle Tools, Physics, GLTools, and everything else), it will be pretty cool. That is, if we get to it before Snow Leopard changes the game (just as Leopard did for us last year). *shrugs* we'll see what happens.

I am glad to see you throw out the word Physics plain and clear. The part of Particle Tools that is the REAL gem, even more than the particles themselves, is the actual physics/forces aspect of it all. I keep imagining something like being able to place wind at a given point, with a strength and radius... and have it actually blow against something like a block and push it. Being able to program "drag/resistance/bounce" on that block. I would LOVE magnetic forces/strengths with +/-. Black hole force... set position, diameter, strength... when a particle flies by the "black hole" it might fly by the "rim" or get swirled right into the hole...

cwright wrote:
fancy shaders (for volumetric stuff) = calculate the distance through "stuff" that the pixel goes through, and act accordingly (with proper deterministic 3d noise, and possible scattering). I've heard rumors of these kinds of things, but I've never actually seen them with my own eyes... (not sure if they're "shaders" in the GLSL sense, or in the RenderMan sense -- guessing the latter, for technical reasons).

Yeah.... I was just being a wise ass. I remember looking at the moving type from Quartz Crystal, which kind of reminded me of the Lorenz attractor in some ways... and looking at it in different GL Logic modes, with different clear settings. You can get an interesting natural blur going as you crank up the iterations, and I kept thinking it might be possible to get fluid-like/smoke-like effects with that, but I forgot about it until this very moment.

Nah, I really don't think there is a way to do that with GLSL shading in QC, at least in THAT way, or in any kind of straightforward manner... it's the per-pixel distance calculation that makes me go, "eh, probably not possible". Not that I am any kind of old hand at GLSL, but it seems like at this point I have pored through all the major GLSL literature, and I sure haven't got that one going :o)

On another GLSL note, there are some GLSL shaders that at first glance aren't quite set up to work in QC because you need certain kinds of inputs... but I have noticed that you can simply generate the right kind of shifting noise using "textbook" examples, etc., in a Render In Image, and then pass it to the appropriate GLSL input... except that it is not a performance dream.
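
Concretely, the workaround is just declaring a sampler uniform and feeding it from the Render In Image, instead of relying on the built-in noise1()/noise2() calls (which the drivers of that era mostly just return 0.0 for). A minimal, untested fragment of the idea, with made-up input names:

```glsl
// Sample an animated noise image rendered elsewhere in the composition
// instead of calling the (effectively unimplemented) GLSL noise functions.
uniform sampler2D noiseImage;   // wire the Render In Image output here
uniform sampler2D baseImage;    // whatever you want to distort
uniform float distortAmount;

void main()
{
    vec2 uv = gl_TexCoord[0].xy;

    // re-center the noise around zero and use it to push the lookup around
    vec2 wobble = texture2D(noiseImage, uv).rg - 0.5;

    gl_FragColor = texture2D(baseImage, uv + wobble * distortAmount);
}
```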

As long as I am wishing, throw in a "hair" deformation patch in Kineme3D!!! Instantly deforms any object and makes it "hairy". And then make the hairs be able to wave with Particle Tools wind (ok, I know that's not going to be possible, but how impressive would that be?)

toneburst's picture
GLSL Stuff

gtoledo3 wrote:
On another GLSL note, there are some GLSL shaders that at first glance aren't quite set up to work in QC because you need certain kinds of inputs... but I have noticed that you can simply generate the right kind of shifting noise using "textbook" examples, etc., in a Render In Image, and then pass it to the appropriate GLSL input... except that it is not a performance dream.

There are a whole load of GLSL effects that can't be done in QC, as far as I can tell. Anything that requires the tangent vector to be passed with each vertex (e.g. displacement/bump-mapping) can't really be done, unfortunately. I did mention that this would be cool to include in Kineme3D, but cwright didn't seem keen...

In terms of other input types that aren't supported by the QC GLSL Shader patch, the most annoying one is probably matrices. You can declare a number of uniform vecs though, and construct the matrix inside the shader (with a small performance hit, presumably). I've done this on a number of occasions. It would be really cool if the GLSL patch accepted structures as input, and converted them automatically to arrays/matrices/vectors, but maybe in the next version...
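
Just to show the shape of the workaround (the row0..row3 names are arbitrary, and I haven't tested this exact snippet):

```glsl
// The QC GLSL patch won't accept a mat4 input directly, so publish the
// matrix as four vec4s and rebuild it per vertex.
uniform vec4 row0;
uniform vec4 row1;
uniform vec4 row2;
uniform vec4 row3;

void main()
{
    // note: the mat4 constructor takes columns, so either feed column
    // data in, or transpose on the CPU side before publishing
    mat4 m = mat4(row0, row1, row2, row3);

    gl_Position = gl_ModelViewProjectionMatrix * (m * gl_Vertex);
    gl_FrontColor = gl_Color;
}
```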

Quote:
As long as I am wishing, throw in a "hair" deformation patch in Kineme3D!!! Instantly deforms any object and makes it "hairy". And then make the hairs be able to wave with Particle Tools wind (ok, I know that's not going to be possible, but how impressive would that be?)

There are a number of ways of making meshes 'hairy', I think. One method involves duplicating the geometry and expanding the duplicates to make 'shells' around the original mesh, then messing with the alpha on the textures. Displacing the texture coordinates can make the hair appear to wave.
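
Very roughly (untested, and all of the input names here are invented), the shell passes look something like this: render the same mesh several times with an increasing shellOffset, and let a hair-density texture thin out the outer shells.

```glsl
// --- Vertex shader: push each shell out along the normal ---
uniform float shellOffset;   // 0.0 for the base mesh, increasing per pass
uniform float waveAmount;    // small texcoord shift to make the tips "wave"
uniform float time;

void main()
{
    vec4 v = gl_Vertex;
    v.xyz += gl_Normal * shellOffset;

    gl_Position = gl_ModelViewProjectionMatrix * v;

    // displace the texture coords more on the outer shells
    vec2 sway = vec2(sin(time), cos(time * 1.3)) * waveAmount * shellOffset;
    gl_TexCoord[0] = gl_MultiTexCoord0 + vec4(sway, 0.0, 0.0);
}
```

```glsl
// --- Fragment shader: thin out the shells with a hair-density texture ---
uniform sampler2D hairMap;   // noisy texture: bright = hair strand
uniform float shellOffset;

void main()
{
    float density = texture2D(hairMap, gl_TexCoord[0].xy).r;

    // fewer fragments survive on the outer shells, tapering the strands
    float alpha = density - shellOffset;
    if (alpha <= 0.0)
        discard;

    gl_FragColor = vec4(vec3(0.4, 0.3, 0.2), alpha);
}
```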

a|x

gtoledo3's picture
That's the exact hair method

That's the exact hair method I was thinking of! It is funny that I've read some people "complaining" that this method doesn't look "real"... but it looks pretty darn good to me.

Interesting notes about the GLSL and your findings (I feel like a friggin' caveman explaining this stuff, using the wrong terms all the time.... but I SWEAR I know what I am doing... kinda).

cwright's picture
commander keen

[This is a presentation of facts and decisions -- not intended to sound aggressive or dismissive]

I wasn't keen on tangent vectors only because that immediately takes Kineme3D from "generic" (works in GLSL, all transformation patches, all GLTools patches, etc., without any special considerations) to "specific" (only works in a few places, not what the user expects). (The Mesh Blender in Kineme3D is one such special-case GLSL renderer, and to date only one person has ever used it for anything that I'm aware of... lots of effort to make it go, but it's not generic enough to help anyone.)

OpenGL has the following per-vertex properties:

  • Vertex position
  • Normal
  • Texture coordinates (8 sets, actually)
  • Color (front and back).

  • GLSL adds 8 "vertex attributes" (vec4s that can be anything you wish)

That's all. Tangent isn't on the list explicitly, so handling it has to happen with one of the following methods:

  • Stuff it in a texture coordinate, and hope nobody uses it (it isn't safe in QC, since ports are free to nuke texture coords for their own purposes)
  • Stuff it in a vertex attribute (this is the correct, portable way to do it outside of QC -- but it requires that all shaders used expect data in this format as well, or else things end up breaking).

So, for everyone who isn't using tangent vectors (majority of users), they get a free performance hit, and extra memory usage to boot. And, to top it off, certain shaders that collide with the tangent vector attribute will start to behave erratically. Not fun :(

Also, using tangent vectors requires custom GLSL shaders, making existing shaders rather difficult to work with since they can't be combined/nested without some work. I could probably get away with this (since GLSL's not really solidified in QC), but it isn't The Right Way to do this (there needs to be more cohesion between the shader and the model, and the connection between data and shader needs to be made more obvious).
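
For the record, the shader side of the vertex-attribute route looks something like this (a sketch only; the attribute name is whatever the host application binds, and none of this is what Kineme3D actually ships):

```glsl
// Vertex shader consuming a tangent supplied as a generic vertex
// attribute (bound by the host via glBindAttribLocation or equivalent).
attribute vec3 tangent;

uniform vec3 lightPos;       // light position in eye space (assumed input)
varying vec3 tsLightDir;     // light direction in tangent space, for bump mapping

void main()
{
    // build the tangent/bitangent/normal basis in eye space
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);
    vec3 t = normalize(gl_NormalMatrix * tangent);
    vec3 b = cross(n, t);

    vec3 eyeVert = vec3(gl_ModelViewMatrix * gl_Vertex);
    vec3 l = lightPos - eyeVert;

    // rotate the light vector into tangent space for the fragment shader
    tsLightDir = vec3(dot(l, t), dot(l, b), dot(l, n));

    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
```

And that's exactly the problem: every shader that touches the model has to agree on that attribute, which is the "cohesion between the shader and the model" bit above.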

At the time this was all cooking (late 2007, early 2008), Snow Leopard was a dream on paper, nothing more. We saw evidence of 3D stuff in QC from its roots (PixelShox had an obj loader, and some deformers), and we figured they'd start bringing that forward. At WWDC we scratched the surface of how much, and after some reverse engineering of the developer seed more recently, we discovered basically the full extent (it's going to be freaking amazing, but that's all I'll say for now). With that lack of knowledge at the time, we opted to go conservative and not hard-code vertex attributes, in the hopes that no one was using them/would use them in the future (that last part's the killer).

Don't get me wrong -- I REALLY REALLY want QC to have an amazingly powerful shader architecture. I really do. I have some ideas to take it up a step. But until the basics are in place, hard-coding advanced stuff is only going to get in the way once those basics /are/ in place. That takes time, patience, and room to explore... :/ It should be much more doable in Snow Leopard (and done cleanly), so everyone gets what they want, without having to resort to ugly, special-case hacks.

toneburst's picture
I knew you had good reasons.

I knew you had good reasons. I'd just forgotten what they were ;) Very clearly explained there though.

a|x