Stringing together QCRenderers?

sbn..'s picture

Hello.

When doing live visuals, I use my own cocoa + QC app. This has gotten fairly unwieldy, and I'm warming up to do a rewrite from scratch. I have some examples and sources (CoGe, qcAdvanced, etc), but would like to hear the experienced devs here on a specific question.

First, the problem I have is that my main composition is this big, monolithic mess. The problems I've solved from the Cocoa side are ones that aren't easily handled inside QC, like signal routing, movie thumbnails, knobs & controls, selection of generators / effects / mixers. (Right now, the signal routing etc. has a good mix of javascript too, which I'd like to avoid in a rewrite.)

Especially the selection of "blocks" (generators, effects, mixers) is a problem. I have a number of composition importers whose paths I can change from the Cocoa UI, e.g. to change effects. Whenever I do this, the composition freezes for a few frames up to a second. Other stuff can cause this, too, like changing movie paths inside a generator.

On top of that, I'd like more preview options, and a smarter system that can turn rendering on and off as needed on specific parts. I'd like to find a method that would ideally enable channel A to hang / freeze, while channel B plays smoothly on, so the effect of a freeze isn't as jarring on the final output.

So, my question is this: Is stringing together several QCRenderers a bad idea, performance wise? By stringing together, I mean load, say, two separate qtz's. The first one, A, has a published image output, B has a published image input. The Cocoa controller then listens for changes on A's output, and forwards them to B's input.

Will this give me worse performance than keeping everything inside a single qtz? Will the image have to leave VRAM? Is there a better way to do this? Do I need to do anything specifically wrt. threading?

Any thoughts, examples or documentation links are appreciated!

cwright's picture
Re: Stringing together QCRenderers?

I'm assuming you're going to have just one NSOpenGLContext, and init all the qcrenderers to use that context, correct?

That should work. There'll be some additional overhead (the fixed QCRenderer overhead will be multiplied), but also some savings (all the dead portions of the monolithic graph will get ignored, using smaller "modular" qtz's instead). Profile to see how this helps/hinders, as it'll depend heavily upon how the compositions are structured.

regarding threading -- you don't need to do anything unless you explicitly create additional threads. If you start some threads in the hopes that QT-related freezes will go away, please give up all hope and try writing a custom plugin instead -- Movie Loader doesn't work off the main thread now (from what I've seen -- perhaps fixed on Snow Leopard?)

If you use the QCImage opaque type for image transfers between renderers, you shouldn't ever leave vram, so that should be quick and painless.
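
In code, the forwarding step might look something like this (a sketch; the port names "outputImage" and "inputImage" are placeholders for whatever ports you've actually published, and both renderers are assumed to share the same GL context):

```objc
// Render A for this frame, then hand its output to B.
NSTimeInterval t = [NSDate timeIntervalSinceReferenceDate] - startTime;
[rendererA renderAtTime:t arguments:nil];

// Asking for the @"QCImage" opaque type keeps the image in QC's
// optimized (GPU-resident) representation instead of forcing a readback.
id image = [rendererA valueForOutputKey:@"outputImage" ofType:@"QCImage"];
[rendererB setValue:image forInputKey:@"inputImage"];
[rendererB renderAtTime:t arguments:nil];
```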

.lov.'s picture
Re: Stringing together QCRenderers?

Some Cocoa UI elements (like popup menus and contextual menus) will block the main thread; I think that's what's happening when you talk about compositions freezing. Using a CVDisplayLink instead of an NSTimer-driven rendering routine will solve your problem.
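
A minimal CVDisplayLink setup looks roughly like this (untested sketch; `MyController` and its `render` method are placeholders):

```objc
// The display link fires on its own thread, so rendering keeps going
// even while the main thread is stuck tracking a menu.
static CVReturn displayLinkCallback(CVDisplayLinkRef link,
                                    const CVTimeStamp *now,
                                    const CVTimeStamp *outputTime,
                                    CVOptionFlags flagsIn,
                                    CVOptionFlags *flagsOut,
                                    void *context)
{
    [(MyController *)context render];  // MyController is a placeholder
    return kCVReturnSuccess;
}

// Setup, e.g. in awakeFromNib:
CVDisplayLinkRef displayLink;
CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
CVDisplayLinkSetOutputCallback(displayLink, &displayLinkCallback, self);
CVDisplayLinkStart(displayLink);
```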

'Stringing together' several comps isn't a bad idea; I use it a lot in the upcoming CoGe 1.0.

PS: if you have time to wait, I will release CoGe 1.0 soon (1-2 months); I think it will be very helpful for you.

.lov.'s picture
Re: Stringing together QCRenderers?

cwright wrote:
Movie Loader doesn't work off the main thread now (from what I've seen -- perhaps fixed on Snow Leopard?)

Some codecs (?!) like Sorenson 3 only work on the main thread; it's a bit odd.

psonice's picture
Re: Stringing together QCRenderers?

You can still use NSTimer, you just have to be careful which runloop you add it to. I had this issue initially (composition freezing when I adjusted controls), and changing the runloop fixed it. Which is lucky, because I have 'special' requirements for my current app that make 'fast as possible' a very bad idea :)
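
For reference, the fix is roughly this (sketch; `renderFrame:` is a placeholder selector):

```objc
// scheduledTimerWithTimeInterval: puts the timer in NSDefaultRunLoopMode
// only, which is suspended while menus and sliders are being tracked.
// Adding it for NSRunLoopCommonModes keeps it firing during UI tracking.
NSTimer *timer = [NSTimer timerWithTimeInterval:1.0 / 60.0
                                         target:self
                                       selector:@selector(renderFrame:)
                                       userInfo:nil
                                        repeats:YES];
[[NSRunLoop currentRunLoop] addTimer:timer forMode:NSRunLoopCommonModes];
```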

.lov.'s picture
Re: Stringing together QCRenderers?

psonice wrote:
You can still use NSTimer

That's true, but I think using a Core Video display link is better :)

sbn..'s picture
Re: Stringing together QCRenderers?

cwright wrote:
I'm assuming you're going to have just one NSOpenGLContext, and init all the qcrenderers to use that context, correct?

Sure, I think so... I'd like to pull additional images (so I can preview a single channel, for instance), but these would have lower priority. I'll look into it.

Quote:
That should work. There'll be some additional overhead (the fixed QCRenderer overhead will be multiplied), but also some savings (all the dead portions of the monolithic graph will get ignored, using smaller "modular" qtz's instead). Profile to see how this helps/hinders, as it'll depend heavily upon how the compositions are structured.

regarding threading -- you don't need to do anything unless you explicitly create additional threads. If you start some threads in the hopes that QT-related freezes will go away, please give up all hope and try writing a custom plugin instead -- Movie Loader doesn't work off the main thread now (from what I've seen -- perhaps fixed on Snow Leopard?)

If you use the QCImage opaque type for image transfers between renderers, you shouldn't ever leave vram, so that should be quick and painless.

Thanks for those points; I'm trying to accumulate enough context that I can read the docs adequately. Posts like this help tremendously, thanks.

sbn..'s picture
Re: Stringing together QCRenderers?

Thanks, I'll look forward to seeing what you come up with.

The composition freeze is roughly proportional to the complexity of the .qtz I'm loading, or the size of the .mov. I use a Cocoa controller and just send along a path to a qtz through a published string input (no parameter view or whatever it's called). Still not sure what's going on, but I can see I have some reading to do regarding timers. Thanks!

Oh, and BTW, System Monitor shows that my app has 10-20 threads, so QC must be doing some thread management of its own.

.lov.'s picture
Re: Stringing together QCRenderers?

Do you use the Movie Loader in asynchronous mode? I discovered this issue with async mode turned on; without async mode, everything works well.

psonice's picture
Re: Stringing together QCRenderers?

It's better 90% of the time, for the other 10% it's a really bad idea ;)

.lov.'s picture
Re: Stringing together QCRenderers?

Could you explain a situation where it is a bad idea? No offense, I'm just curious :)

vade's picture
Re: Stringing together QCRenderers?

I do this in my v002 app (stringing together multiple QCRenderers), and it works very fast as long as you follow these rules:

1) use the same context or a shared context to init all of your QCRenderers so that resource sharing works.

2) use a CVDisplayLink or other threaded playback. If you do, however, you will have to make sure that you respect Technical Q&A QA1538: http://developer.apple.com/mac/library/qa/qa2008/qa1538.html This is easy to do, however.

3) If you use a threaded CVDisplayLink you cannot use Apple's built-in movie player. Roll your own; it's not too hard. Use QTOpenGLTextureContexts and pass a CVOpenGLTextureRef in as the input image to your published port. Very fast.

4) When passing images from renderer A to renderer B, use valueForOutputKey:ofType: with the QCImage type; this will pass an optimized QC image object rather than non-optimal formats, as Chris said.

5) to do away with loading/lag issues, pre-load your QCCompositions and cache them. You take the hit once on startup, and you can organize your effects into folders you know you will use, so you don't load them from scratch dynamically. You can also init all of your QCRenderers in one go, but I found loading the comp to be OK and not laggy. Just keep them in an array and do a copy. Basically I have one master cache/NSDictionary of sources, effects, and renderers, and I pull out the one I want and copy it to the channel I need. This is just a mem copy, not re-initting everything. I've seen very little lag with that, but then my effects are not too crazy with huge images internal to the QTZ. If you are really worried, go for broke and make an array of QCRenderers off the bat. I did not feel the need, so I just cache the QCCompositions.

6) I'd really stay away from using QCView; use a custom view and manage your own GL contexts.

7) if you are going to use a QCParameterView, make a delegate and under no circumstances allow it to show your image input ports. This will slow you down like a motherfucker. I stay away from QCParamView and roll my own custom UI elements, but it was a huge pain in the ass to set up. The delegate method is straightforward; I have it somewhere if you need it.

It's some work, but it pays off. I really need to get back to working on v002 and get it out there. Fuck.
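
Point 5 boils down to something like this (a rough sketch, not my actual code; `effectPaths`, the key scheme, and the initializer are placeholders, and note that initWithCGLContext:pixelFormat:colorSpace:composition: is 10.6-only):

```objc
// Startup: load every effect .qtz once and cache the QCComposition.
NSMutableDictionary *cache = [[NSMutableDictionary alloc] init];
for (NSString *path in effectPaths) {
    QCComposition *comp = [QCComposition compositionWithFile:path];
    if (comp)
        [cache setObject:comp forKey:[path lastPathComponent]];
}

// Later, when a channel needs an effect: pull the cached composition
// and hand a copy to a fresh QCRenderer; no disk access, no re-parse.
QCComposition *comp = [[cache objectForKey:@"blur.qtz"] copy];
QCRenderer *renderer =
    [[QCRenderer alloc] initWithCGLContext:cglContext
                               pixelFormat:pixelFormat
                                colorSpace:colorSpace
                               composition:comp];
```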

.lov.'s picture
Re: Stringing together QCRenderers?

vade wrote:
3) If you use a threaded CVDisplayLink you cannot use Apple's built-in movie player. Roll your own; it's not too hard. Use QTOpenGLTextureContexts and pass a CVOpenGLTextureRef in as the input image to your published port. Very fast.

I'm using a CVDisplayLink and I can use Apple's built-in movie player. I'm just using it on the main thread; UI elements blocking don't affect me, because I use my own GUI elements.

vade's picture
Re: Stringing together QCRenderers?

Then you aren't using the display link to render the QCRenderer with the movie player, are you? ;) If it's on the main thread, it is not using the display link.

Just be careful with locking contexts since you are now passing resources across threads.

.lov.'s picture
Re: Stringing together QCRenderers?

Yes, I'm an idiot :) The movie player works with a CVDisplayLink for some codecs like Photo JPEG (on 10.5 and 10.6 too!), but most of them don't.

psonice's picture
Re: Stringing together QCRenderers?

Whenever you don't want full framerate.

For my case, I have 2 stages. One runs some post-processing on a video stream, but the video will often be at 5fps (possibly as little as 60spf video too! ;) and running at more than the video rate will cause major problems (I use feedback loops...) In that case, I'm driving the QCRenderer directly from a QTCaptureSession though, so the NSTimer isn't involved.

The other case is where I pause the video feed and mess with the post-process settings alone. Full framerate would be great here, BUT... it's running on a laptop that's a long, long way from power. Maximum battery life is much more important than speed, so 10fps is great.

There's probably a ton more cases like this where NSTimer is better. Anything where you don't want it hammering the CPU, or where the animation is slow. Maybe you'd even want >60fps. How about running a simple .qtz at 240fps off screen and combining the frames for motion blur? :)

Like I said though, 90% of the time you want CVDisplayLink :)

.lov.'s picture
Re: Stringing together QCRenderers?

Thanks for explaining it! I've never needed a low-fps QCRenderer :)

franz's picture
Re: Stringing together QCRenderers?

vade wrote:
4) When passing images from renderer a to renderer b, use outputKey: ofType:QCImage, this will pass an optimized QC image object rather than non optimal formats, as Chris said.

Interesting. Is the Spooky patch using the QCImage type when passing pictures to multiple renderers?

sbn..'s picture
Re: Stringing together QCRenderers?

Thank you kindly for sharing your experiences! This list will almost certainly save me several days of experimentation.

By the way, I have the same dilemma about releasing stuff - I've been working on this performance system since Tiger came out, and I keep thinking that I should release it so others can use it or learn. It's always just a bit too kludgy, the source is too ugly, too many hacks, all sorts of excuses.

Nevertheless, I use it live, and apart from some quirks it works well. Maybe I need to work on my vanity in this regard, and just release it. At the rate it's going, I'll have something ready for release just in time for OS XI to make it obsolete.