Fisheye view plugin?

mrboni's picture

Hi, does anyone know of a plugin for Quartz that will give a fisheye view of the 3d environment?

I found a glsl shader on the web (in text form) but I know nothing about glsl and unsurprisingly a copy and paste job into the glsl shader plugin didn't work.

What would be amazing would be if the creator of the field of view patch in gl tools could modify it to support fisheye. I could provide links to the maths behind it but unfortunately not much else. I don't think it'd be too hard though.

If anyone can help I'll be eternally grateful as I have a lot riding on getting this to work!

Thanks, Will

cwright's picture
matrix maths

I'm guessing this can be somewhat accomplished with the Matrix Mode patch -- I don't know the math off-hand, but that's a good place to start.

If you can point me to the example GLSL source, I can QC-ify it for you pretty easily (assuming it obeys QC's GLSL limitations).

If you can provide some math, I can probably whip something up (I wrote the FoV patch in GL Tools ;)

mrboni's picture
Woop! Thanks for the quick

Woop! Thanks for the quick reply. I'll collate some links and post them shortly.

FYI, this is going to be used to let us project quartz patches 360 degrees in a geodome. Maybe you know a bit about this as it's related to the PBMesh plugin on this site?

mrboni's picture

Hi, have a look at these -

PDF describing how to make a gl fisheye shader

The same thing condensed as a course handout

Info from Paul Bourke's amazing site

Let me know if that is enough to go on, and thanks!

(btw, it's a 180 degree angular fisheye that's needed, though I imagine the ability to adjust the angle could be useful, like in your FOV plugin)

cwright's picture

looks like this works in one of two ways:

1) the cpu can transform all the vertices manually -- this isn't possible in QC since vertex data isn't ever exported (though it's possible in kineme3d, if the proper patches were written).

2) a gpu vertex shader can transform the vertices. this works in QC pretty much out of the box.

You'll need Kineme GLTools for the GL Ortho patch.

place a GLSL shader inside the ortho patch. make sure the ortho patch is set to -1, +1, -1, +1, 0.0, 1000.0 (left, right, top, bottom, near, far)

for the glsl vertex shader code, use this:

const float PI = 3.14159265;

void main( void )
{
   vec4 pos = gl_ModelViewMatrix * gl_Vertex;
   float rxy = length( pos.xy );
   if( rxy != 0.0 )
   {
      // angle between the view axis and this vertex
      float phi = atan( rxy, -pos.z );
      // 180-degree angular fisheye: angle maps linearly to lens radius
      float lens_radius = phi / (PI/2.);
      pos.xy *= ( lens_radius / rxy );
   }
   gl_Position = gl_ProjectionMatrix * pos;
}

leave the fragment shader stuff alone, or change it to a more useful fragment shader (something with lighting, or no texture lookups perhaps).

And then you're all set.

Since this requires vertex handling on the cpu, or a custom shader in GLSL, I don't think it's an appropriate addition to the GLTools patch collection -- it wouldn't play nicely with existing GLSL shaders.

fisheye.qtz (6.95 KB)

franz's picture
DL problem

for some reason i'm unable to download this file ...

cwright's picture

are you using safari? if so, right click, and save target as -- otherwise, it opens, and shows a grey composition (since it's all Kineme GLTools-driven, and those don't load in safari, nothing interesting renders).

if that's not the problem, I don't know what is -- I can download it fine logged in (or not logged in) from firefox and safari.

franz's picture

yeah, safari. sorry to have disturbed you. the problem was on my side (some tweaked settings somehow prevented me from downloading it... but not the recent audio patch, strangely)

tobyspark's picture
you want full dome...

see what they've been doing in vvvv

amazing stuff

mrboni's picture
Yup, fulldome is the way.

Yup, fulldome is the way. Those links are impressive.

Our project is definitely less advanced, but environment is the key. We're setting up a 'fulldome' at a UK festival with 360 degree visuals and surround sound. The aim being to create an 'immersive' sensory environment for all the receptive festival punters. We're going to have people playing live surround sets with accompanying visuals, some filmed (with a fisheye lens) and others generated, like in Quartz.

I'm a Quartz novice so that side will be pretty basic. I'm working predominantly on morphing audio reactive 3d meshes based on things like this -

If anyone is interested in contributing content I would be much obliged, and there could be a free ticket to the festival in it.

@cwright - Thanks a lot for your shader. It works! But I'm having a little trouble with it.

For one, when I remove the vertex shader part (which seems to be just passing the plasma texture through in your example) it stops working. I'm sure it's pretty simple but I don't know enough about this sort of thing.

The other is that when I put a composition inside that uses a gl shader to do mesh distortion, that shader doesn't seem to work. I think I've read that you can't use more than one GLSL patch in a composition. Is this the problem?

In that case, could it be solved by compiling the shader into a plugin like the FOV plugin? I don't know how hard this is but if this fisheye shader worked in the same way as the FOV plugin it would be perfect.

Thanks again, Will

cwright's picture
fisheye shader

The vertex shader is where the vertex distortion happens -- perhaps you meant the fragment shader? you'll need some kind of code in there for it to work...

You can't nest GLSL shaders (meaning, you can't put one inside another, and have both work at once on the same object). This is a limitation of GLSL, not QC, so there's no real way to "compile" it into a plugin that Just Works, unfortunately. Do you need to have it distort GLSL-deformed meshes? You can probably put the code into each GLSL vertex shader (after all the deformation transformation stuff) and have it continue to work as expected. It's tedious, I know, but that's about the only solution I can think of...
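To make that concrete, here's a sketch of what a merged shader might look like. The sine-wave displacement is a made-up stand-in for whatever deformation your existing vertex shader does; the point is just that the fisheye block comes after it:

```glsl
const float PI = 3.14159265;

void main( void )
{
   vec4 v = gl_Vertex;

   // hypothetical deformation step -- replace with your own mesh-distortion code
   v.y += 0.1 * sin( v.x * 10.0 );

   // fisheye step, applied after all deformation
   vec4 pos = gl_ModelViewMatrix * v;
   float rxy = length( pos.xy );
   if( rxy != 0.0 )
   {
      float phi = atan( rxy, -pos.z );
      pos.xy *= ( (phi / (PI/2.)) / rxy );
   }
   gl_Position = gl_ProjectionMatrix * pos;
}
```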

tobyspark's picture
no way... the festival isn't shambala perchance?

and, come to that, even if it isn't, please get in touch:

mrboni's picture
Yup. Email sent.

Yup. Email sent.

mrboni's picture
Yep, fragment shader is what

Yep, fragment shader is what I meant. Is there some standard 'dummy' code I can insert?

cwright's picture

that depends on what you mean by "dummy" (sorry to be so pedantic... this really is a complicated subject that's hard to "water down" without making glaring errors).

If by dummy you mean "Act like normal OpenGL" -- there is a reference fixed-functionality shader somewhere on the internet -- I've heard it's easy to find, but I've never had any luck myself. here's a start:

If by dummy you mean "Set a constant color", that's simple (and mostly useless):

void main()
{
   // constant white (r, g, b, a)
   gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

This gets more complicated as you add texturing...:

uniform sampler2D texture;

void main()
{
   gl_FragColor = texture2D(texture, gl_TexCoord[0].xy);
}

... colors ...:

uniform sampler2D texture;

void main()
{
   gl_FragColor = gl_Color * texture2D(texture, gl_TexCoord[0].xy);
}

... and lighting (left as an exercise for the reader).

mrboni's picture

I think maybe I'm not quite understanding the implications of using GL shaders.

What I want to be able to do is make a composition with different 3d elements, each with their own colour/texture, and then view that whole scene through a fisheye, maintaining the individual element's characteristics.

I've attached two compositions. One is of a multi-coloured rotating 'tunnel' made from sprites and lines, and the other is the same thing but inside the fisheye shader. The lens distortion in the vertex shader works as it should, but it seems that anything I put into the fragment shader affects everything within it, making everything the same colour or texture.

What I meant by 'dummy' was code that would not apply any transformation to the elements, but leave their appearance as it is. Maybe this isn't possible.

I found 'fragment shader pass through' code -

/* pass-through fragment shader */
void main(void)
{
   gl_FragColor = gl_FrontColor;
}

but I get an error in Quartz - 'gl_FrontColor' : undeclared identifier

Octotunnel v3.qtz (13.86 KB)
Octotunnel v3 [fish].qtz (14.62 KB)

cwright's picture

In the vertex shader, add this piece:

gl_FrontColor = gl_Color;

right before the gl_Position = ... part near the end.

In the fragment shader, make it

gl_FragColor = gl_Color;

and you should be in business:

mrboni's picture

In 22 minutes, and on a Sunday. You are a legend!

I've attached a composition with the correct shader for anyone else who wants to play.

Many thank yous, Will

GL Fisheye template.qtz (1.77 KB)

mrboni's picture

I don't want to busy your Sunday too much but I've just tried another composition in the shader and it's removed all the colour and lighting.

Don't suppose you could take a quick look?

1 file this time, just remove the contents of the shader to take a look at the original. It won't actually do anything without midi clock and audio input though.


SpeakerConeBPM[Fish].qtz (42.12 KB)

cwright's picture
color is ok

This is a more complicated example....

The color's right -- the cubes really are getting all white -- manually change the colors, and the composition output changes. That's all good (the bug isn't with GLSL, it's somewhere in your compo)

The lighting is another issue -- Lighting in OpenGL is Fixed-Function functionality, so if you're using a shader (GLSL), you lose that (by virtue of how shaders replace the FF pipeline...). So if you want lighting, you have to code it yourself. you'll need the normal, and then some extra glue in the fragment shader. It's pretty well-documented stuff, I think I posted a link to shaders that do that earlier in this thread. If not, Alex Drinkwater surely has some good lighting shaders; he could point you in a good direction. On the plus side, you can do per-pixel lighting, which looks stunning in my opinion :)
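As a starting point, here's a minimal per-pixel diffuse sketch; it assumes GL light 0 has a position set, and it isn't any particular one of the shaders mentioned above. Vertex shader:

```glsl
varying vec3 normal, lightDir;

void main( void )
{
   normal = gl_NormalMatrix * gl_Normal;
   vec4 pos = gl_ModelViewMatrix * gl_Vertex;
   // direction from the vertex to GL light 0, in eye space
   lightDir = vec3( gl_LightSource[0].position ) - pos.xyz;
   gl_FrontColor = gl_Color;
   gl_Position = gl_ProjectionMatrix * pos;
}
```

and the matching fragment shader:

```glsl
varying vec3 normal, lightDir;

void main( void )
{
   // per-pixel Lambertian term
   float diffuse = max( dot( normalize(normal), normalize(lightDir) ), 0.0 );
   gl_FragColor = vec4( gl_Color.rgb * diffuse, gl_Color.a );
}
```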

mrboni's picture
You're right it is just the

You're right it is just the lighting. I'll investigate..

Is there a similar issue with textures though? Images/textures don't show up inside the shader.


cwright's picture

Inside a GLSL patch, texture is done at the shader level, rather than the object level (meaning you attach the texture to the GLSL Shader, not the objects inside). Then, you have your fragment shader do the texture stuff you want it to do -- all objects inside will have the same texture (this is a limitation of GLSL, and happens even outside of QC).

In a totally-not-personally-attacking kind of way, and if you're programming-inclined, and have the time, spending a day or two coming up to speed on OpenGL and GLSL would help clear up the apparently unusual behaviour surrounding GLSL inside of QC. It's pretty simple to pick up, and then everything starts to make more sense. :)

(feel free to keep asking here, you've got fun questions and interesting compositions :)

mrboni's picture
Back for more

Hi again Chris, don't worry, you didn't scare me off before. Been busy working on things and have actually made progress with the above.

I'm most of the way there but there are still a couple of issues -

How can you modify the properties of a texture when the modifications are taking place inside the shader object?

I've attached two files, one fisheye and one not, of a load of concentric iterated spheres each with video in as the texture. The audio peak of each sound frequency band affects the texture translation, resulting in lots of 'spinning' spheres.

When I use this in the shader, I have to feed the video into the texture part of the shader, and then have no more control over it, so cannot join it up to the jiggery pokery happening inside the iterator.

Is there any way round this?


tobyspark's picture
matrix maths

i've spent ages looking for some kind of answers on what those matrix maths numbers need to be, and bugger if i can find any answers. opengl et al all have functions for feeding in field of view etc so the actual numbers don't seemingly exist.

so if anybody knows any 3d programming masters or suchlike, badger them for a 180 fisheye projection matrix!

toneburst's picture
Matrix Maths

I don't think you can do a fisheye effect just with matrix maths in a vertex shader. You can use matrices to change the perspective of a mesh or scene, but afaik, you can't bend things around into curves using matrices, for that you need parametric sin/cos functions, like the ones cwright wrote for Will.

I'm no expert though....


tobyspark's picture
field of view patch / qc camera co-ords

further brain-dump

i thought the field of view patch explodes at 180, but it is actually processing the image, it's just squashed to nothing in the centre. other fiddling makes it look like it's correctly altering the field of view, but qc's camera position means the image is squashed to nothing in the centre because the camera isn't at 0,0,0. it's somewhere further up the z axis, looking down it.

the problem as far as i can see it is that you can't move the camera, and moving the scene to the camera hits the near clip plane.


feature request: magic fix =]

hack investigation: what is the default camera position, field of view, clip planes of qc? i thought i worked it out, but i've gone round in circles so many times with this i'm not even sure any more.

toneburst's picture
QC Camera

In OpenGL (and by extension, QC) there is actually no such thing as a camera. The view position is always at the origin of the world coordinate system- 0.0,0.0,0.0. So, to simulate the effect of moving the camera, you have to move everything in your scene in the opposite direction. This is done behind the scenes by QC by manipulating the gl_ModelViewProjectionMatrix.
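A sketch of that in vertex shader terms (cameraPos is a hypothetical uniform, not something QC actually provides):

```glsl
uniform vec3 cameraPos;  // hypothetical camera offset, in eye space

void main( void )
{
   vec4 pos = gl_ModelViewMatrix * gl_Vertex;
   // 'moving the camera' by +cameraPos is the same as
   // moving the whole scene by -cameraPos
   pos.xyz -= cameraPos;
   gl_Position = gl_ProjectionMatrix * pos;
}
```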

Things get a bit weird when you start using 3DTransform patches though- because the coordinates of objects inside the 3DTransform no longer necessarily correspond to their coordinates in the 3D world space, so it all gets a bit confusing.

Also, it's completely possible that Apple may have 'moved the camera' in QC- ie offset the coordinate system so that an object at 0.0,0.0,0.0 is actually at 0.0,0.0,1.0, for example, in true world coordinates. This would actually make a lot of sense from a usability angle, since otherwise, placing a sphere, for example, at 0.0,0.0,0.0, would put the camera inside it.

Of course, this all may be complete nonsense. Coordinate systems and dimensions are something I've always struggled with in QC (especially with the Image With String patch, which I find really confusing).

Whether this has any bearing on what you're doing, I don't know.


psonice's picture
image with string

I've found the only way I can get meaningful results from that patch is to set it to pixel dimensions, set a texture size and then use normal font sizes. Otherwise I seem to end up with either something blurry, or a massive texture with odd dimensions.

So you're not alone there!

Going off on a small tangent.. how do people use the coordinates in QC? I find the normal -1 -> +1 screenspace system works well for layout, and the 3d system is fine (complications like the one above aside), and pixel measurements for pretty much everything else. Anyone found a use for the 'use normalised coordinates' option?

toneburst's picture
Image With String Dimensions, Hair-Tearing

I worked for ages on an ASCII-art qcFX for VDMX once, back in the pre-Leopard days. Very frustrating. I got some nice results, but ended up getting very confused, and hit a few dimensions-related bugs I never managed to squash.

I keep meaning to go back to it, but haven't got around to it yet.


tobyspark's picture
had another go; still stuck.

had another go; still stuck.

i can pull a z-axis translation using the kineme matrix macro patch - the top three numbers of the rightmost column are x,y,z translation from origin to origin - which seems to have a different effect from the 3d translation but still doesn't get me any nearer what i need.

alx - the numbers you have available in qc are not true world coordinates, evidenced by that night of fiddling, and there's no way i can see of fiddling anything to effectively put the objects in a sphere around the camera such that a 180fov correctly captures that view as a fisheye image.

the image with string patch does make sense tho? the font size is the character height in qc coords, with the standard caveats about fonts blocking themselves. if you set a width and height it will act like a paragraph box. and if you are designing to a set resolution, change it to use pixel dimensions. my StringToImageStructure patch gives out rendered characters with dimensions in pixels of each character if that helps.

cwright's picture
FoV, 180°, Communism, and You!

I think the magical frustrating part of this is that setting FoV to 180° causes a degenerate matrix (the math involved in the FoV matrix setup uses tan(theta/2), and when theta is 180, you get tan(90), which is +-infinity (make sure your calculator is in degrees mode, otherwise you'll get tan(90 radians), which isn't wonky, and you'll try to tell me I'm an idiot)). This, of course, makes everything really really wonky and tiny, because of its 'degenerateness'.
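Spelled out with the standard gluPerspective-style matrix (f is the cotangent term in question):

```
f = 1 / tan(fovy / 2)

    | f/aspect   0     0                     0                  |
P = | 0          f     0                     0                  |
    | 0          0     (zF + zN)/(zN - zF)   2*zF*zN/(zN - zF)  |
    | 0          0     -1                    0                  |

fovy -> 180 degrees  =>  tan(fovy/2) -> infinity  =>  f -> 0
```

So every vertex's x and y get scaled toward zero, which lines up with the 'squashed to a dot in the centre' behaviour described earlier in the thread.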

As for moving the QC Camera -- not really presently possible, BUT:

For the next beta of GLTools, we're working on a magic ModelView/ProjectionView matrix patch, which will allow you to specify any arbitrary GL matrix you want -- shears, asymmetrical frustums, etc will all be possible, and as an added bonus, you'll be able to scoot the camera around too with this. If there's a non-degenerate way to make a 180° fisheye projection matrix, this will allow it as well, though I'm not sure there's really any mathematically stable way to accomplish this, as a matter of principle. Maybe pbourke can hop on and set this straight though.

If there's not, you'll have to settle for almost 180°

NYDave's picture
Waiting eagerly, but in practice the FoV

...projected in a fulldome is often less than the full 180 degrees. The positioning of the projector may be below the equator plane of the hemisphere. Since there is really only one spot in a dome where the sight lines are ideal, and that is occupied by the projector, there is room for some fudging of the perfect projection, especially (in my opinion) in the case of non-photographic generative and animation material. The eye adjusts, and immersion is a leap of faith in any case.

All that said, I wait eagerly for the new tools, and trust that they will be available and well-tweaked for the opening of our dome in 2010. Further! and Wider!

(oh I get it - communism and degenerateness. From each according to his abilities!)

NYDave's picture
Anticipating a correction

Someone will no doubt write in that with dome projection based on multiple blended projectors on the periphery, or a hemispherical mirror, rather than a central fisheye, the perfect spot may not be occupied. But the majority of fulldome sites coming online now use the single projector system.

tobyspark's picture
i consider myself

i consider myself pre-empted =]

and thanks for that chris, although i still think the root point is valid in this pre-next-gltools world, in that as the fov gets wider, there is some voodoo in qc that makes it appear far further away and so tends to a dot in the centre of the canvas - whether that's a camera distance as i concluded or not.

immersiveness, dome environments, and the 'perfect position' is a whole other thread that i won't derail this one with, now having run tests in both a planetarium (mirror but dome tilted in front of raked seating) and dome tent (mirror onto hemisphere starting 2m high, ie no shadows but with a horizon above everybody's head).

Crispy75's picture
Re: field of view patch / qc camera co-ords

"hack investigation: what is the default camera position, field of view, clip planes of qc? i thought i worked it out, but i've gone round in circles so many times with this i'm not even sure any more."

I'm in the midst of investigating this, along with using PBmesh to translate a straight-line perspective, wide FOV view into a fisheye view.

I'm playing with a FOV of 170 at the moment, although may move to 160 or 150 to reduce the distortion (nearly half the view width ends up rendering the edge 10 degrees of the FOV). The translation from perspective to fisheye will require a custom .data file, which seems to me to be a simple bit of trig. I will report back here with my success or failure.
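For what it's worth, here's a sketch of the trig in question, written as a GLSL fragment shader that warps a wide-FOV perspective render into an angular fisheye (the image and perspFOV uniform names are made up, and this is image-space warping rather than the .data-file approach, but the mapping is the same):

```glsl
uniform sampler2D image;   // the wide-FOV perspective render
uniform float perspFOV;    // its field of view, in radians (e.g. 170 degrees)
const float PI = 3.14159265;

void main( void )
{
   // fisheye image coordinates in [-1, 1]
   vec2 p = gl_TexCoord[0].xy * 2.0 - 1.0;
   float r = length( p );
   if( r > 1.0 )
   {
      gl_FragColor = vec4( 0.0 );  // outside the lens circle
      return;
   }

   // 180-degree angular fisheye: radius maps linearly to angle off the view axis
   float phi = r * (PI / 2.0);
   float theta = atan( p.y, p.x );

   // view-space direction for this fisheye pixel
   vec3 dir = vec3( sin(phi) * cos(theta), sin(phi) * sin(theta), -cos(phi) );

   // project that direction back through the perspective frustum
   float f = 1.0 / tan( perspFOV / 2.0 );
   vec2 uv = ( dir.xy * (f / -dir.z) ) * 0.5 + 0.5;
   gl_FragColor = texture2D( image, uv );
}
```

Note the rim: at phi = 90° the projection divides by zero, and anything beyond perspFOV/2 samples outside the source image, which is exactly why a sub-180 perspective FOV like the 170 above trades away the outer edge of the dome.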