Quartz Composer memory leak on Lion?

woody3549

Hello,

We have been facing two problems with Quartz Composer. I don't know whether two separate discussion threads are needed, so here they are:

  1. The first is 'ghost' images appearing in the composition frame where images from a QuickTime source movie are rendered. The rest of the composition is made of still images representing banners, logos, illustrations, etc. The composition is quite complex, and the total number of rendering patches (Sprites or Billboards) reaches 15 or more (more than 15 levels of overlaid images). We use the composition in a Cocoa application with a QCRenderer initialized off-screen: we write images (NSImage objects) from the source movie (via a QTMovie object) to a published image input of the composition, then call the QCRenderer's snapshotImage method to get output images from the composition and append them to an output QTMovie. As explained, we sometimes (not always) get 'ghost' images in the output movie, in the frame where the source video is rendered. These ghost images occur randomly and sparsely (3 or 4 times per minute of generated video), and they correspond to other images used in the composition (banners, logos, ...), always appearing at their original size, color, orientation, etc., so they may correspond to input image buffers being substituted for an output buffer. Furthermore, these ghost images usually appear in groups of 3 or 4, each separated from the next by a single 'expected' source movie image. We had no memory leak with this setup, only 'ghost' images.

  2. The second problem is the one explained by Boris: to work around the first problem of 'ghost' images, we tried to reduce the number of rendering patches (Sprites and/or Billboards) in the composition and to use Core Image filters embedded in the composition to do the image compositing (image overlay). We tried two things: 1) simply using a CISourceOverCompositing filter in the 'edit filter function' of the Core Image Quartz Composer patch; 2) coding the following Core Image kernel in the CI QC patch:

kernel vec4 coreImageKernel(sampler bckImage, sampler topImage)
{
    vec4 bckPix, topPix;
    topPix = sample(topImage, samplerCoord(topImage)); // premultiplied top pixel
    bckPix = sample(bckImage, samplerCoord(bckImage)); // premultiplied background pixel

    // Standard source-over compositing for premultiplied alpha
    return topPix + (1.0 - topPix.a) * bckPix;
}

Both of these solutions did the overlay work correctly, but both led to a big memory leak, in particular with HD source movie images rendered to HD output images: in this case the 24 GB of RAM of our Mac Pro was eaten up in a few tens of seconds, while our Cocoa application never used more than 1.5 GB of memory (according to the Mac's Activity Monitor).
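For reference, here is a minimal sketch of the off-screen render loop described above (the composition path, published input key, frame rate, and movie handling are assumptions, not our actual code). One thing worth ruling out is autorelease buildup rather than a true leak: snapshotImage returns an autoreleased NSImage, so without a per-frame pool those images accumulate until the enclosing pool drains, which can look like runaway memory use:

```objc
#import <Quartz/Quartz.h>

QCRenderer *renderer = [[QCRenderer alloc]
    initOffScreenWithSize:NSMakeSize(1920.0, 1080.0)
               colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB)
              composition:[QCComposition compositionWithFile:@"/path/to/comp.qtz"]];

for (NSTimeInterval t = 0.0; t < duration; t += 1.0 / 25.0) {
    @autoreleasepool { // drained every frame
        NSImage *sourceFrame = [sourceMovie currentFrameImage];     // QTMovie
        [renderer setValue:sourceFrame forInputKey:@"SourceImage"]; // assumed key name
        [renderer renderAtTime:t arguments:nil];
        NSImage *output = [renderer snapshotImage];                 // autoreleased
        // ... append output to the output QTMovie ...
    }
}
```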

My hypothesis is that Quartz Composer actually does the image compositing on the GPU when using rendering patches like Sprites and Billboards, but that Core Image filters embedded in a QC composition do not, thus working in RAM and causing the observed memory leak.

I have no idea now how to do our compositing work around these two issues, which really look like bugs in Quartz Composer...

Any suggestion would be appreciated. Thanks


cybero
Re: Quartz Composer memory leak on Lion ?

Core Image uses the GPU or, if one is not available, the CPU. That is my understanding of Core Image's operation, including compositing. See http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Co....

Regarding the CI kernel you posted, it seems to operate, at least for me, in a somewhat unreliable fashion, though I'm not currently sure why.

As often as not it will initially composite both images, but with a grey tint to the background, which was originally a set of coloured lines on a white background. [compositedornot1.png]

Then, after a stop and restart, it will composite both, but with a background colour, usually red, regardless of the input type, whether HD movie or still image. [compositedornot4.png]

The other key distinguishing factor is that the profile monitor shows next to no activity when the CI filter kernel is working with no discernible colour alteration, but when it goes into the red it really starts to peak.

Puzzling.

Attachments: compositedornot4.png (17.27 KB), compositedornot1.png (17.12 KB)

gtoledo3
Re: Quartz Composer memory leak on Lion ?

Core Image is a hodgepodge. As far as I know, some stock Core Image filters use GLSL shaders behind the scenes, some use OpenCL, and some may even use other things (?).

It's just as likely that you have a driver leak when using a certain kind of process, or are seeing a leak in something particular to the QC editor environment (tooltips, image metadata, data conversion, or something else).

Certain CI filters may in fact be falling back to the CPU route on your machine, but that alone isn't a reason or cause for leaking, and using the Quartz Composer editor app doesn't cause fallback to happen unless you set it to do so via the private preferences.
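One way to probe the GPU-vs-CPU question directly is to force Core Image onto its software renderer in a small test harness and see whether the leak behaves differently; a sketch, with the bitmap dimensions as placeholders:

```objc
#import <QuartzCore/QuartzCore.h>

// Create a CPU-only Core Image context via the documented
// kCIContextUseSoftwareRenderer option, then render the same filter
// chain through it and watch memory in Instruments.
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef cg = CGBitmapContextCreate(NULL, 1920, 1080, 8, 0, cs,
                                        kCGImageAlphaPremultipliedFirst);
CIContext *cpuContext =
    [CIContext contextWithCGContext:cg
                            options:@{(id)kCIContextUseSoftwareRenderer: @YES}];
```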

The QC Editor app does have leaks, and in more places than just this. It also renders tons of bezier curves and textures, lets you JIT-compile code on the fly, etc., and one should probably never have all that going on while rendering live graphics, other than for editing the evaluation graph, even if the editor app didn't have a single leak. A qtz is best thought of as a resource to use in your own app, where you can then truly profile for leaks.

Have you tested the core image kernel when implemented as an Image Unit, or in a context other than QC?

My main suggestion is to file bugs as you see fit, but also not to think of the QC app as a performance environment in and of itself, but as a tool for making evaluation graphs or hosting plugin modules that you then use in your own app. I'd also suggest you test the qtz in the context of your own app for leaks (sounds like you did that), and also try testing the Core Image kernel in some context that doesn't rely upon QC, because in this case it may very well be some Core Image related bug that isn't necessarily "the fault" of QC.
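As a concrete way to exercise the same compositing outside QC, the overlay can be driven through the stock CISourceOverCompositing filter in a plain Cocoa tool: if this loop also leaks under Instruments, the bug is in Core Image (or the driver) rather than in Quartz Composer. File paths and the loop count are placeholders:

```objc
CIImage *top = [CIImage imageWithContentsOfURL:
                   [NSURL fileURLWithPath:@"/tmp/top.png"]];
CIImage *bck = [CIImage imageWithContentsOfURL:
                   [NSURL fileURLWithPath:@"/tmp/background.png"]];

CIFilter *over = [CIFilter filterWithName:@"CISourceOverCompositing"];
[over setValue:top forKey:kCIInputImageKey];
[over setValue:bck forKey:kCIInputBackgroundImageKey];
CIImage *result = [over valueForKey:kCIOutputImageKey];

CIContext *ctx = [CIContext contextWithCGContext:cg options:nil]; // cg: a CGContextRef
for (int i = 0; i < 1000; i++) {
    @autoreleasepool {
        CGImageRef frame = [ctx createCGImage:result fromRect:[result extent]];
        CGImageRelease(frame);
    }
}
```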

(My gut is that the leak could be happening because of conversion of image types not happening correctly; some improper implementation of the code that lets CI filters populate the QC patch library and be available as "patches"; some problem that happens when the system has to detect which route to take (GPU vs. CPU); image metadata getting screwed up somewhere in the chain; a bad GPU driver leak; or splitters leaking in some scenarios, all of which may or may not occur in the context of your app. You might also experiment with fiddling with codecs, with rendering the video to a Billboard / Render In Image at the very front or end of the chain, and with the bit depth / colour correction.)