
Issues with the "Movie Importer" patch while rendering with QCRenderer

Marxon13:

I am writing a program for OS X Lion that is basically a video mixer. I create a QCRenderer from my patch using initOffScreenWithSize:colorSpace:composition:, then programmatically pass video from a camera and a path to a movie file to two separate inputs of the patch. The movie file is then loaded by a "Movie Importer" patch. The problem is that the Movie Importer returns an error and no movie.

The error is: *** Message from : ERROR: Running movie on thread 0x109106000 while initialized on thread 0x1053b5000

This message appears while processing most frames; occasionally it works. This is what I'm doing to get an image from the QCRenderer:

[offscreenQCRenderer renderAtTime:[NSDate timeIntervalSinceReferenceDate] arguments:nil];
NSImage *frame = [offscreenQCRenderer createSnapshotImageOfType:@"NSImage"];
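For context, the error suggests the Movie Importer's movie is being driven from a different thread than the one the renderer was created on. One pattern that addresses that (a sketch only, untested against this exact setup; the class name, size, and selectors are my own, and note that GCD serial queues do not guarantee a fixed thread, which is why a dedicated NSThread is used) is to pin both creation and every renderAtTime: call to one thread:

```objc
#import <Quartz/Quartz.h>

// Hypothetical wrapper: confines a QCRenderer to a single dedicated thread so
// the Movie Importer is always run on the thread it was initialized on.
@interface PinnedQCRenderer : NSObject {
    NSThread   *_thread;
    QCRenderer *_renderer;   // created and used only on _thread
    NSImage    *_lastFrame;
}
- (id)initWithCompositionPath:(NSString *)path;
- (NSImage *)renderFrame;
@end

@implementation PinnedQCRenderer

- (id)initWithCompositionPath:(NSString *)path
{
    if ((self = [super init])) {
        _thread = [[NSThread alloc] initWithTarget:self
                                          selector:@selector(threadMain)
                                            object:nil];
        [_thread start];
        // Create the renderer on the render thread, not the caller's thread.
        [self performSelector:@selector(setUpWithPath:)
                     onThread:_thread withObject:path waitUntilDone:YES];
    }
    return self;
}

- (void)threadMain
{
    @autoreleasepool {
        // Keep a run loop alive so performSelector:onThread: can deliver work.
        [[NSRunLoop currentRunLoop] addPort:[NSMachPort port]
                                    forMode:NSDefaultRunLoopMode];
        [[NSRunLoop currentRunLoop] run];
    }
}

- (void)setUpWithPath:(NSString *)path
{
    QCComposition *comp = [QCComposition compositionWithFile:path];
    _renderer = [[QCRenderer alloc]
        initOffScreenWithSize:NSMakeSize(1280, 720)   // size is an assumption
                   colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB)
                  composition:comp];
}

- (void)renderOnThread
{
    [_renderer renderAtTime:[NSDate timeIntervalSinceReferenceDate] arguments:nil];
    _lastFrame = [[_renderer createSnapshotImageOfType:@"NSImage"] autorelease];
}

- (NSImage *)renderFrame
{
    [self performSelector:@selector(renderOnThread)
                 onThread:_thread withObject:nil waitUntilDone:YES];
    return _lastFrame;
}

@end
```

The waitUntilDone:YES calls make this synchronous from the caller's point of view; an asynchronous variant would hand the frame back via a callback instead.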

Inside the patch, the "Movie Importer" patch's timebase is set to External, and I pass it a modified patch time, looping the overall patch time to the length of the movie with a "Round" patch.

Any help with fixing this is greatly appreciated.

Stringing together QCRenderers?

sbn..:

Hello.

When doing live visuals, I use my own Cocoa + QC app. This has gotten fairly unwieldy, and I'm warming up to a rewrite from scratch. I have some examples and sources to study (CoGe, qcAdvanced, etc.), but I'd like to hear from the experienced devs here on a specific question.

First, the problem I have is that my main composition is one big, monolithic mess. The problems I've solved from the Cocoa side are the ones that aren't easily handled inside QC: signal routing, movie thumbnails, knobs and controls, selection of generators / effects / mixers. (Right now the signal routing also uses a good deal of JavaScript, which I'd like to avoid in a rewrite.)

The selection of "blocks" (generators, effects, mixers) is a particular problem. I have a number of composition importers whose paths I can change from the Cocoa UI, e.g. to swap effects. Whenever I do this, the composition freezes for anywhere from a few frames up to a second. Other things can cause this too, like changing movie paths inside a generator.

On top of that, I'd like more preview options, and a smarter system that can turn rendering on and off as needed for specific parts. Ideally I'd like an approach where channel A can hang or freeze while channel B plays on smoothly, so a freeze isn't as jarring on the final output.

So, my question is this: Is stringing together several QCRenderers a bad idea, performance-wise? By stringing together, I mean loading, say, two separate .qtz files: the first one, A, has a published image output; B has a published image input. The Cocoa controller then listens for changes on A's output and forwards them to B's input.
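To make the idea concrete, the forwarding step I have in mind would look roughly like this (a sketch; the port names "outputImage" and "inputImage" are assumptions, and both renderers are assumed to be already initialized):

```objc
#import <Quartz/Quartz.h>

// Sketch of chaining two offscreen renderers: render A, read its published
// image output, feed it to B's published image input, then render B.
NSImage *RenderChainedFrame(QCRenderer *rendererA, QCRenderer *rendererB)
{
    NSTimeInterval t = [NSDate timeIntervalSinceReferenceDate];

    [rendererA renderAtTime:t arguments:nil];
    id image = [rendererA valueForOutputKey:@"outputImage"];   // published output of A

    [rendererB setValue:image forInputKey:@"inputImage"];      // published input of B
    [rendererB renderAtTime:t arguments:nil];

    return [[rendererB createSnapshotImageOfType:@"NSImage"] autorelease];
}
```

My guess is that passing the default QC image object between the two, rather than converting to an NSImage or bitmap in the middle, gives the pipeline the best chance of keeping the data on the GPU, but that's exactly the kind of thing I'd like confirmed.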

Will this give me worse performance than keeping everything inside a single qtz? Will the image have to leave VRAM? Is there a better way to do this? Do I need to do anything special with regard to threading?

Any thoughts, examples or documentation links are appreciated!