
Stringing together QCRenderers?

sbn..

Hello.

When doing live visuals, I use my own Cocoa + QC app. It has gotten fairly unwieldy, and I'm warming up to a rewrite from scratch. I have some examples and sources to learn from (CoGe, qcAdvanced, etc.), but I'd like to hear from the experienced devs here on a specific question.

First, some background: my main composition is a big, monolithic mess. The problems I've solved from the Cocoa side are the ones that aren't easily handled inside QC, like signal routing, movie thumbnails, knobs and controls, and selection of generators / effects / mixers. (Right now the signal routing also involves a fair amount of JavaScript, which I'd like to avoid in a rewrite.)

The selection of "blocks" (generators, effects, mixers) is an especially sore point. I have a number of Composition Importer patches whose paths I can change from the Cocoa UI, e.g. to swap effects. Whenever I do this, the composition freezes for anywhere from a few frames up to a second. Other things can cause this too, like changing movie paths inside a generator.

On top of that, I'd like more preview options, and a smarter system that can turn rendering on and off for specific parts as needed. Ideally, I'd like an architecture where channel A can hang or freeze while channel B plays on smoothly, so a freeze isn't as jarring on the final output.

So, my question is this: is stringing together several QCRenderers a bad idea, performance-wise? By stringing together, I mean loading, say, two separate .qtz files. The first one, A, has a published image output; B has a published image input. The Cocoa controller then listens for changes on A's output and forwards them to B's input.
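For concreteness, here's a rough sketch of the controller loop I have in mind (not working code; the port keys `@"outputImage"` / `@"inputImage"` and the method name `-renderFrameAtTime:` are placeholders, and the actual keys would depend on what the compositions publish):

```objc
#import <Quartz/Quartz.h>

// Sketch: drive two chained QCRenderers from one render callback,
// e.g. a CVDisplayLink or NSTimer. rendererA and rendererB are
// assumed to be QCRenderer ivars loaded from the two .qtz files.
- (void)renderFrameAtTime:(NSTimeInterval)time
{
    // Render composition A for this frame.
    [rendererA renderAtTime:time arguments:nil];

    // Grab A's published image output...
    id image = [rendererA valueForOutputKey:@"outputImage"];

    // ...and forward it to B's published image input, then render B.
    if (image)
        [rendererB setValue:image forInputKey:@"inputImage"];
    [rendererB renderAtTime:time arguments:nil];
}
```

(Part of my question is what `-valueForOutputKey:` actually hands back here, and whether that round-trip forces the image out of VRAM.)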

Will this give me worse performance than keeping everything inside a single .qtz? Will the image have to leave VRAM? Is there a better way to do this? And do I need to do anything special with regard to threading?

Any thoughts, examples, or documentation links are appreciated!