GLSL Hardware Problems?

TheRandomDude's picture

Hey hey,

I'm having some major issues with GLSL when I want to deform vertices, usually forcing me to use the software renderer (slow as poop). Why is my graphics card freaking out about this code? Is it a bug, or is it something I'm missing?

Vertex Shader -

uniform sampler2D displacementMap;
 
void main(void)
{
   // Pass the texture coordinate through to the fragment shader
   gl_TexCoord[0].xy = gl_MultiTexCoord0.xy;
 
   // Sample the displacement map at this vertex
   vec4 dv = texture2D( displacementMap, gl_MultiTexCoord0.xy );
 
   // Displacement amount, taken from the blue channel
   float df = 0.0004*dv.z;
 
   // Push the vertex out along its normal
   vec4 newVertexPos = vec4(gl_Normal * df * 100.0, 0.0) + gl_Vertex;
 
   gl_Position = gl_ModelViewProjectionMatrix * newVertexPos;
}

Fragment Shader -

uniform sampler2D colorMap;
uniform sampler2D displacementMap;
 
void main(void)
{
   // Brighten the color by the displacement value so peaks read lighter
   vec4 bumpdown = texture2D(displacementMap, gl_TexCoord[0].xy) + 0.4;
   gl_FragColor = texture2D(colorMap, gl_TexCoord[0].xy) * bumpdown;
}

cwright's picture
displacement

(QC has a long history of bad GLSL displacement with texture maps -- this seems to work without exploding, which is a good first step :)

What exactly is the problem? Just slow performance? What hardware are you on? Are you getting any errors/unexpected behaviours? I ran it on my old MacBook (GMA 950-'powered'), and it hit >80fps with a sphere and a torus inside, displaced with plasma.

As far as I've been able to glean, all texture lookups in vertex shaders cause an instant drop down into software mode :( Even for cards that should know better, the drivers just suck or something (except in the case of NVidia -- then they don't just suck, they /SUCK/).
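For what it's worth, one wrinkle that sometimes matters on cards that do support vertex texture fetch: a vertex shader has no screen-space derivatives, so `texture2D` there can't pick a mip level on its own. The usual pattern is to sample an explicit LOD with `texture2DLod`. Here's a sketch of the same displacement written that way -- no promises it rescues hardware (like the GMA chips) whose drivers fall back to software regardless:

```glsl
uniform sampler2D displacementMap;

void main(void)
{
   gl_TexCoord[0].xy = gl_MultiTexCoord0.xy;

   // Explicit LOD 0: vertex shaders have no derivatives, so the mip
   // level must be spelled out. Still requires a card/driver pair
   // that actually does vertex texture fetch in hardware.
   vec4 dv = texture2DLod(displacementMap, gl_MultiTexCoord0.xy, 0.0);

   float df = 0.0004 * dv.z;
   vec4 newVertexPos = vec4(gl_Normal * df * 100.0, 0.0) + gl_Vertex;
   gl_Position = gl_ModelViewProjectionMatrix * newVertexPos;
}
```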

TheRandomDude's picture
standards

Right now I'm just using the GL Polygon Mode to coerce OpenGL to run it through the software renderer, but it's not a proper solution. I'm on a Black Macbook with one of those silly integrated Intel GMA X3100 graphics chips, mostly as a baseline for deployment. You can see the before and after down below.

I've also been wondering if I can spread all the texture processing out over a longer period so that I don't have to black out the screen every time I generate a new planetoid. You should see the lockup when I generate these massive 2048x1024 textures :D.

Attachments:
Picture 6.png (212.91 KB)
Picture 7.png (241.79 KB)

gtoledo3's picture
Nice... I like what you are

Nice... I like what you are doing and was trying to go down a similar alley, but I actually kept getting messages like "fatal error, too many pixels" (to paraphrase). Weird stuff. I was trying to do some extreme supersampling on the end result, though.

TheRandomDude's picture
Well this thing isn't ready

Well this thing isn't ready for primetime, but feel free to take a look at the composition. Make sure you have most of the plugins (I haven't rooted around yet to see which are needed, but probably GL Tools and Texture at least). It's all a pretty rough draft, but I think I'm getting somewhere.

Attachment:
Moonshine2.qtz (49.18 KB)

psonice's picture
kineme texture = displacement compatible?

It looks like the kineme texture stuff is outputting images with 'generic rgb' colourspace - from what I remember of my investigations into this, that's the one thing that will actually work with displacement shaders. It shouldn't be necessary to use software GL if that's the case.

Otherwise, I have a (slightly buggy, but working) plugin to remove the colorspace info. You can then use any RGB texture with it (i.e. pretty much anything except raw camera input). Let me know if you need that.

toneburst's picture
I think you still need to

I think you still need to convert it. I've used it for VDM, but I seem to remember I still had to 'sanitize' it first.

a|x

TheRandomDude's picture
I'd love to have the plugin

I'd love to have the plugin to give it a shot, but even some background on colorspaces in quartz composer and how to sanitize textures would be extremely helpful.

I'm really jumping straight into GLSL (I even bought the Orange Book), and any dynamic learning I can get would be great.

toneburst's picture
Good luck

with the Orange Book. It's really useful on the language itself. You might find that a lot of the shaders won't work in QC though, because they need access to data that isn't accessible in a GLSL Shader patch in QC.

a|x

acrusum's picture
Custom patch that converts any image to a CGImage?

Using the private Image Debugging Info patch, I noticed that only textures using the base class CGImage work as inputs to a GLSL vertex shader for displacement (which is why Kineme Texture renders work). The Video Input patch produces images with the base class of CVPixelBuffer and nearly everything else produces images as a CIImage. (I apologize in advance if this is old news.)

I've been trying to write a custom patch that converts any image into a CGImage, but I've never worked with Cocoa before and I'm a bit confused by all of the different ways of dealing with images. Can someone point me to a good primer on NSImage, CIImage, CGImage, CVPixelBuffer, etc.? Or let me know if I'm on the wrong track completely?

cwright's picture
details

The details for this are somewhat known (psonice has done a ton of empirical testing on this topic, franz has done a bit, and I've done technical investigations) -- you're basically on track, though ColorSpace also seems to play a part in it.

There are subtle reasons for each image type:

  • CIImage -- hardware-accelerated bitmap data, usually stored on the video card, and usually resolution-limited to the texture size of the video card. These are actually more like recipes for images, rather than images themselves.
  • CGImage -- software bitmap data, stored in system RAM.
  • NSImage -- this isn't an image, but a collection of "Representations" which are images (NSBitmapImageRep, etc.). These abstract a lot, and aren't useful for much directly (but can usually be converted to other image types).
  • CVPixelBuffer -- video data (unlike the above, this is usually a YUV 4:2:2 format -- the memory layout tends to be weird, and it's usually hardware accelerated for display, but it requires a hardware colorspace conversion as well).

Converting images into CGImages isn't interesting or difficult; however, in QC it tends to be embarrassingly slow.

If you're using the official API, you might not have much luck, since images are all abstracted to QCImages. If you're using the unofficial API, you'll want to check out the image-to-pixel-buffer stuff documented somewhere on this site (around when Leopard was released, smokris did some investigation into this -- maybe the source for it made it into some other plugins that we offer the source for now?)