GLSL

glPolygonMode wireframe quads

bonerton's picture

Hey, there used to be a Kineme patch to render wireframes in quad mode (Kineme GL Grid Renderer), but it seems to have disappeared recently? Is there a way to do this with a GLSL shader?
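
(For what it's worth, one shader-only approximation, offered as a minimal sketch rather than the old Kineme patch's method: assuming each quad carries 0..1 texture coordinates per face, the fragment shader can discard interior fragments and keep only the edges. The lineWidth uniform below is made up for illustration.)

Vertex shader:

// Pass-through: forward per-vertex colour and the quad's texture coordinates.
void main()
{
    gl_FrontColor  = gl_Color;
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position    = ftransform();
}

Fragment shader:

// Keep only fragments that lie near a quad edge in texture-coordinate space.
uniform float lineWidth;   // hypothetical edge thickness, e.g. 0.02

void main()
{
    vec2 uv = gl_TexCoord[0].st;
    float edgeDist = min(min(uv.x, 1.0 - uv.x), min(uv.y, 1.0 - uv.y));
    if (edgeDist > lineWidth)
        discard;           // drop interior fragments, leaving a wireframe look
    gl_FragColor = gl_Color;
}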

Raytrace Core Image (Composition by gtoledo3)

Author: gtoledo3
License: Creative Commons Attribution-NonCommercial
Date: 2012.11.11
Compatibility: 10.5, 10.6, 10.7, 10.8
Categories:
Required plugins:
(none)

Nothing super fancy, but a raytracer working in Core Image.

I'd been thinking about doing this for a while, out of interest in the way Core Image always gives you a dedicated image output, without having to render to the screen or a texture before doing something with the result.

It's a bit of a pain because some commonly used functions and working styles aren't supported, but it works if you stay within those boundaries. It seems a little slower than GLSL; simpler, more 2D-oriented fragment shaders seem to fare better when ported.
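
(Not the composition's code, just a rough sketch of the idea in the Core Image kernel language: one analytic ray/sphere intersection per output pixel, with the output dimensions passed in as arguments. Everything here, including the kernel name, camera setup and light direction, is made up for illustration.)

/* Hypothetical CI kernel: flat-lit sphere, camera at the origin looking down -z. */
kernel vec4 raytraceSphere(float width, float height)
{
    vec2 uv = (destCoord() / vec2(width, height)) * 2.0 - 1.0;   // -1..1 across the image
    vec3 rd = normalize(vec3(uv, -1.5));       // ray direction from the origin
    vec3 ce = vec3(0.0, 0.0, -3.0);            // sphere centre, radius 1
    float b = dot(rd, -ce);                    // quadratic terms of |t*rd - ce|^2 = 1
    float c = dot(ce, ce) - 1.0;
    float h = b * b - c;                       // discriminant: hit when >= 0
    float t = -b - sqrt(max(h, 0.0));          // nearest hit distance along the ray
    vec3 n  = normalize(t * rd - ce);          // surface normal at the hit point
    float diff = max(dot(n, normalize(vec3(0.6, 0.7, 0.4))), 0.0);
    float hit  = step(0.0, h) * step(0.0, t);  // 1 if the ray actually hits the sphere
    vec3 col   = mix(vec3(0.08), vec3(diff), vec3(hit));
    return vec4(col, 1.0);
}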

How to not overwrite a model's texture with the GLSL fragment shader

jrs's picture

Hi All,

How do I keep GLSL from overwriting the texture details that get loaded with a model? I've attached an example composition where I texture the teapot with its image port. This works fine outside a shader, but as soon as I place it inside, my model goes white.

Apart from searching the web and not being able to find anything, I came across this thread: https://kineme.net/Discussion/GeneralDiscussion/Fisheyeviewplugin

which at one stage talks about the same thing. Chris mentioned adding the following to the vertex shader:

gl_FrontColor = gl_Color;

And the following to the fragment shader:

gl_FragColor = gl_Color;

but it doesn't work. I get the feeling this is something simple and easy, but I just can't see where I'm going wrong.

Cheers James
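
(For reference, a minimal sketch of a pass-through pair that keeps the model's texture rather than relying on gl_Color alone, assuming the model's image is bound to texture unit 0; the sampler name tex is arbitrary.)

Vertex shader:

// Forward the colour and the model's texture coordinates to the fragment stage.
void main()
{
    gl_FrontColor  = gl_Color;
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position    = ftransform();
}

Fragment shader:

uniform sampler2D tex;   // assumed bound to the model's image on unit 0

void main()
{
    // Sample the texture; multiplying by gl_Color keeps any lighting/tint.
    gl_FragColor = texture2D(tex, gl_TexCoord[0].st) * gl_Color;
}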

ofxComposer

gtoledo3's picture

I mentioned this tech a while back, and it met a big yawn around the forum, though I had some pretty interesting back and forth with some forum members and other QC users offline.

The system is still fairly alpha, but I like it very much, for these reasons:

-OpenFrameworks has a LARGE, active community.

-It's an open-source project (both ofx and ofxComposer).

-It's GLSL shader based, and I think that there's a good future in GL and GL shader tech. Many of us are already well familiar with it.

-It immediately provides video input, and lets us route that to a shader. So much traditional QC stuff revolves around taking an image, giving it an effect, or calling up different "cool visual" scenes. This much is basically already possible.

-There are some great ideas like quad warping and masking on an actual patch level. It kind of has to be seen in action to be appreciated.

Last I used it, it was not truly production ready; it's very new. I could probably set up a file and do a few things on the fly live, but there were some issues with file restoration, with loading shaders that have errors, and with what subsequently happens.

I like it though, because, aside from the active ofx community, there are so many addon frameworks that lend themselves to the kinds of things we do already (CV, motion detection, GUIs, all kinds of stuff).

It's not unimaginable for something like non-shader objects to be represented in ofxComposer, because hardware objects (like the camera and Kinect) already can be. Now... I am not crazy about some of the ofx methods, and having to learn some new abstractions sometimes, but it's not rocket science either. I find myself thinking it's worth it.

I'd recommend that anyone who wants to check out ofxComposer take a look at the project - I seem to remember Patricio having his own branch set up with dependent addons to make it compile, at least before the most recent ofx version update. I wrote Patricio tonight to see the current status of this (beyond what I can grok from the GitHub page), because I'm interested in contributing to this one as time and competency allow. I mentioned this a while back and didn't get much interest, and I can't blame anyone, but I think it would be helpful for us to jump on some open-source stuff that's similar to QC in function and that we can have a positive influence on.

I'm not recommending this or the Three.js/Node.js techs as any kind of alternative to QC - far from it. I just think they have some overlap with the kind of stuff we do, and it would be cool if QC-ers could influence these technologies early on :-)

Ashima Noise External Texture Warp (Composition by gtoledo3)

Author: gtoledo3
License: Creative Commons Attribution-NonCommercial
Date: 2012.06.25
Compatibility: 10.4, 10.5, 10.6, 10.7
Categories:
Required plugins:
(none)

This is a GLSL shader that is a "texture warp/distortion" effect.

It uses the Ashima Noise implementation that's been kicking around lately, which I particularly like and think looks nice and organic.

I'd used it for generating patterns, but had the idea a month or so ago to use the noise to perturb an input texture - which didn't occur to me at first, since the whole deal is that it's a "textureless" noise implementation, but I'm glad it did!

After I had that hanging around for a while, I decided to add some simple "feedback" loop facilitated lighting, which has a kind of eerie look :-)

I've been able to generate really cool-looking abstract-landscape-type looks by feeding in textures that have those kinds of colors going on, and macerating the texture to taste. I've also achieved some pretty gross "melting flesh"/"monster" looks. I've had some fun with this filter in the past while, and I hope you all do as well. :-)

Go through the "Mode" values to check out the various basic looks; there are four modes in all.

"Amount" will increase the amount of the rippling, while "noiseFreq" will tend to increase the density of rippling per area.

"Speed" controls the pace of distortion fluctuation.

"Bump" will do some stuff to the .z channel to mess with the lighting a bit in modes where it's active.

"Spot R1/R2" control the throw of the mouse active light.

"gammaOn/gamma" controls gamma (psonice's gamma code...if it works, why not use it?).

Then there's some color channel offset stuff.
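
(Very loosely, the core of the warp amounts to something like the fragment-shader sketch below, which is not the composition's actual code: noise perturbs the texture lookup coordinates. A cheap sin-based placeholder stands in for Ashima's simplex snoise(), and the uniform names only roughly mirror the published inputs.)

uniform sampler2D inputImage;
uniform float Amount;      // strength of the rippling
uniform float noiseFreq;   // density of rippling per area
uniform float Speed;       // pace of the distortion fluctuation
uniform float Time;

// Placeholder noise; the composition uses Ashima's "textureless" simplex snoise().
float fakeNoise(vec3 p)
{
    return sin(p.x * 12.9898 + p.y * 78.233 + p.z * 37.719);
}

void main()
{
    vec2 uv = gl_TexCoord[0].st;
    float t = Time * Speed;
    // Two decorrelated lookups drive the x/y displacement of the texture fetch.
    vec2 offset = vec2(fakeNoise(vec3(uv * noiseFreq, t)),
                       fakeNoise(vec3(uv * noiseFreq + 17.0, t)));
    gl_FragColor = texture2D(inputImage, uv + offset * Amount);
}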