How to save your composition as video with full quality and framerate

psonice

I can't believe I missed this before. I've been after a way to capture a composition and save it as video, and using the movie exporter patch or the image exporter patch just doesn't cut it, as performance is too low. Here's how to do it the 'right way' (note that this won't work for interactive stuff):

  1. Export your composition as a QuickTime movie. Make sure you set the resolution to what you want the video saved at, and set the time to the length of the composition.

As far as I can tell, that just embeds the .qtz file in a .mov container. It has unexpected side effects though: a .mov composition gets a time slider in QuickTime so you can scrub through the composition, but the same file as .qtz doesn't. That's actually really handy for quickly checking through a big composition, but the time bar doesn't affect the audio, just the effects. Also, you get an fps count with .qtz, but not with .mov, for no apparent reason.

  2. Open the .mov file in QuickTime. You're going to need the Pro version here, because its video export function will actually convert your composition to H.264 or whatever else! So go to File -> Export, pick your codec and settings, set the resolution to whatever you want, and let it go.

There's a catch here: audio support seems very, very flaky. The composition will play with audio, but the audio seems detached from QuickTime. The exporter won't recognise the audio track at all, so you'll get a silent video at the end and have to add the audio track back in later.

Still, this will give you a perfect video capture. I've just exported some of my stuff to video at 60fps - it's not realtime, so feel free to crank the resolution up as high as you want =)

cwright
alternatives

To get this without QT Pro, you can write a simple* Cocoa app that uses a QCRenderer to render, and QTKit to encode to video. We really should get around to making an app like this; it's probably a weekend-or-two hack at worst. I know that I'd use it often :)
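
The core of it is only a screenful. Here's a minimal sketch, assuming 10.5's QTKit additions and an offscreen OpenGL context set up elsewhere; the paths, frame rate and duration are all placeholders:

    // Render a composition frame-by-frame and encode it with QTKit.
    // Sketch only: error handling and GL context setup are omitted.
    #import <Quartz/Quartz.h>
    #import <QTKit/QTKit.h>

    void renderCompositionToMovie(NSOpenGLContext *context,
                                  NSOpenGLPixelFormat *format)
    {
        QCRenderer *renderer =
            [[QCRenderer alloc] initWithOpenGLContext:context
                                          pixelFormat:format
                                                 file:@"/tmp/comp.qtz"];
        NSError *error = nil;
        QTMovie *movie = [[QTMovie alloc] initToWritableFile:@"/tmp/out.mov"
                                                       error:&error];
        [movie setAttribute:[NSNumber numberWithBool:YES]
                     forKey:QTMovieEditableAttribute];

        double fps = 60.0, duration = 10.0;   // placeholder length/rate
        NSDictionary *attrs =
            [NSDictionary dictionaryWithObject:@"jpeg"
                                        forKey:QTAddImageCodecType];

        for (double t = 0.0; t < duration; t += 1.0 / fps) {
            // Rendering at explicit timestamps is what decouples the
            // capture from realtime performance.
            [renderer renderAtTime:t arguments:nil];
            [movie addImage:[renderer snapshotImage]
                forDuration:QTMakeTime(1, (long)fps)
             withAttributes:attrs];
        }
        [movie updateMovieFile];
        [movie release];
        [renderer release];
    }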

Some cool additions to consider: rendering at super-high frame rates (120+ fps) and merging frames for some kind of film-like motion blur, or rendering at super-high resolution (6400x4800) and downsampling to get very high-quality AA. Watch out Pixar ;)
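
The frame-merging half is simple enough too. A sketch, assuming the k sub-frames come back as equal-sized RGBA8 buffers (e.g. via glReadPixels):

    // Average k sub-frames (rendered at k times the target fps) into
    // one motion-blurred output frame. All buffers are byteCount long.
    #include <stddef.h>

    static void mergeSubFrames(unsigned char **subFrames, int k,
                               size_t byteCount, unsigned char *outFrame)
    {
        for (size_t i = 0; i < byteCount; i++) {
            unsigned sum = 0;
            for (int f = 0; f < k; f++)
                sum += subFrames[f][i];
            outFrame[i] = (unsigned char)(sum / k);  // box-filter average
        }
    }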

Audio's non-integration with QC will probably cause problems for as long as it remains unintegrated. Fixing this externally is really risky (if Apple decides to do it later on, it'll nullify all our work) and labor-intensive, so I've not put in the resources it needs to get off the ground.

[* simple is relative, of course... ;)]

tobyspark
the key requirement for me is feeding audio into the comp

...and that may not be so trivial.

alternatively, apple could make a wrapper so we could use qc in motion, and as a motion graphics type i'd have the best of all worlds.

psonice
Example app

There's an example app in the development stuff that does pretty much exactly that. Audio would be the only thing that'd really tempt me to do it though.

Obviously, properly integrating audio into QC is a fairly big job, and without that there's no simple access to audio on export. As a half-way house though, it'd be simple enough to check through the composition's bplist for movie loader patches set to asynchronous (see the sketch below).

How hard would it be to check when that particular patch gets enabled, and to merge the audio stream with the QC video before encoding? You'd need to load the movie/audio file separately and demux the audio first, I guess. It'd save a fair bit of messing around mixing the audio back in after the video has been encoded.

Just as a note, I suggest scanning for audio-generating patches at the start and waiting for the patch to be accessed, because there are likely to be cases where the audio doesn't start straight away, or where there are multiple audio files in the composition.
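
The bplist check could look something like this -- just a sketch; the "rootPatch"/"state"/"nodes"/"class" keys and the "QCMovieLoader" class name are guesses from poking at compositions, not a documented format:

    // Walk a .qtz plist looking for patches of a given class.
    #import <Foundation/Foundation.h>

    static void findPatches(NSDictionary *patch, NSString *className,
                            NSMutableArray *found)
    {
        if ([[patch objectForKey:@"class"] isEqualToString:className])
            [found addObject:patch];
        // Macro patches appear to keep their children under state -> nodes.
        NSArray *nodes = [[patch objectForKey:@"state"]
                                  objectForKey:@"nodes"];
        for (NSDictionary *node in nodes)
            findPatches(node, className, found);
    }

    NSArray *movieLoadersIn(NSString *qtzPath)
    {
        NSDictionary *plist =
            [NSDictionary dictionaryWithContentsOfFile:qtzPath];
        NSMutableArray *found = [NSMutableArray array];
        // "QCMovieLoader" is an assumed class name -- check your comps.
        findPatches([plist objectForKey:@"rootPatch"],
                    @"QCMovieLoader", found);
        return found;
    }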

cwright
Opacity

How hard would it be to check when that particular patch gets enabled, and to merge the audio stream with the QC video before encoding?
I suggest scanning for audio generating patches

From Apple's API, the QCComposition and QCRenderer objects are totally opaque: the user (us) is given absolutely nothing regarding patch count, patch types, when they're enabled/disabled, etc. There's the plist hack you mentioned (which is actually really simple), but that's not helpful for stateful compositions (ones that dynamically enable/disable parts as they run).

We've done a bit of exploring to extract data out of these opaque types, and it's fairly possible using some simple hacks and the unofficial API. However, the data structures are subject to change, and often do -- more so than the plugin API we typically deal with.

The "correct" solution would be to add an audio context to the QCComposition/QCRenderer objects, and have the QC graph render that as well synchronously with the visuals. This is definitely a non-trivial modification (adding data to existing objects requires massive amounts of treachery). I'm not sure how well intermediate setups would measure up to this.

An alternative, more kludgey solution would be to have the Audio Output patches manage their buffers, and allow a renderer application to read them. This requires walking the QC graph (discussed above), but would give essentially the same functionality, in a less elegant — though much more possible — way.

BTW, what's the sample application that renders compositions to movies? I've not been able to find that one, but spent a bit of today working on one (probably ready in a few days? no idea really), so having a reference app would definitely accelerate development.

psonice
QTKit saves QC as video for free :)

The example app I was thinking of was just the QTKit one in the QuickTime folder, which does very basic QuickTime exporting 'for free'.

It doesn't load QC patches directly, but export them as .mov in QC and it'll load the file just like QuickTime does, then export the QC .mov to a proper video .mov. Not the best of examples, but likely as close as we'll get.

cwright
cool, thanks!

Thanks for the tip! Checking it out. :)

tobyspark
nope, for me it's audio *in*

i.e. the rendering of audio-reactive graphics for use in broadcast-esque projects.

i've already had a world of non-fun with ishowu recording custom equaliser graphics for the royal festival hall project i was involved with.

for most jobs to do with playing sound, faking it in post wouldn't be too hard i'd imagine, or outputting a midi-like control track that an audio app could then process and render down. i'm not saying it's ideal, but it's doable, unlike the audio-in reactive graphics issue.

psonice
A catch.. and a workaround. And a request :D

There's a bit of a catch with doing this. It works fine in general (apart from audio), but if you're using any timers other than patch time, expect problems.

As an example, I was using the position output from a video loader patch to drive the composition, but also using a math patch to scale up the timer (if you want to synchronise to music and you know the BPM, this is a good thing to do - you can get the timer to count beats instead of seconds!). Anyway, in the video capture my timer doesn't work, and it plays back about 50% too fast.

My workaround was to add a 'patch time' patch next to the video patch, and use a math expression to switch between video time and patch time based on an input called 'capture'. You might need to add an offset if the patch has any loading time, and if you use multiple videos and switch between them, good luck ;) Maybe it'll help somebody though. The math expression is just this:

(capture = 1) ? patchtime * (BPM/60) : videotime * (BPM/60)

The request then: I think it'd be really handy to be able to save and load output from the audio patches (probably just the spectrum + peak level).

That would solve the audio issues with video capture through QuickTime: you'd play the composition in realtime and capture the audio output data to a file, then switch it to load the data from that file, feeding the data back into the composition when you capture.

Does that sound like a good (and hopefully quick + straightforward) way to handle it?

tobyspark
hmm, stream structure to disk

aka "structure dump". writing such a thing would be relatively trivial, but thinking about fluctuating framerates makes this not a good idea imho. possibly the best approach you could make an app that bakes an audio file down to timestamped analysis info in structure form with a reader that could be swapped out with the audio analysis patch, and find and output in the nearest timestamp on execute.

cwright
partial rescue

There's a Structure Load/Save patch in the FileTools (??) beta, with which you could save an individual structure. If you generated a unique file name each frame, you could save them all, and then play them back -- however, the recording would be quantized to the recorded framerate, so you'd have to interpolate the loaded values or something. All in all, this is a lot lot lot of work to do in the composition.

I've thought about the "record data" problem a few times (especially recording over time, which this patch would need to handle) -- I don't think there's a real solution to it yet, but it would definitely solve this problem in a way much easier than implementing audio in QC ourselves :)

The video timer problem sounds like a bug in QT -- you might want to file a bug report on that. As long as you're not using system time anywhere (patch time only), you should get valid output each time. Or perhaps the QT framerate is different from what you're expecting (i.e. not 60)? What happens if you make a composition that does patchtime -> image with string -> billboard, and then record it? Does it count time appropriately?

[edit: toby was too fast :) I like his audio-bake solution much better than the record/interpolate stuff. With that, you could also create "high-def" audio stuff with more frequency bands than the built-in audio input patch.]

psonice
patchtime = fine

Patchtime works OK; my workaround is to use that instead of the time from the video. And I'm not sure if the video issue is a bug or not: I think it works OK when playing in QT, but the export to video happens without audio, so the lack of audio data makes sense.

Re. the streaming-structure-to-disk thing... actually, I think the fact that it's frame-rate dependent is a good thing. If you're setting the composition up and only getting 15fps, but capturing it at 60fps, you'd have 4x more audio data, and that would actually change the effects in the video. It could be messy if you had volume peaks between frames.

If you captured the data dependent on frame rate, you'd at least get what you saw, or close to it, although I think you'd need the option to interpolate on playback to avoid getting, say, 10fps effects on an animation running at 60. You'd need to capture the patch time alongside the structure data to get it right, of course.

As another alternative, less complicated than baking the audio data out to structures, how about an 'audio sampler' patch? It would sample an audio file each frame and just provide that data to the audio processor (although I think a modified audio processor patch would be needed to take data from outside the audio system). That would also cure all of the audio issues.

toneburst
Saw this the other day

and a little bell rang. Not quite what you're talking about, but interesting, nonetheless.

http://tonfilm.blogspot.com/2007/02/hd-video-rendering-with-realtime-fft...

alx

Quartz Composer Blog: http://machinesdontcare.wordpress.com

Music Site: http://www.toneburst.net

oloool
Render out of Quartz Composer

Hello,

I've successfully exported (and rendered) compositions in Mac OS X 10.3 (export as .mov out of QC, and then export out of QT to any encoded video).

Maybe this is a relevant hint for your work?

In 10.4 and 10.5 I have the problem that I can't render/encode video that reads data from external xml files or images from a local folder. Maybe this is a security feature by Apple?

in hope of a workaround..., regards, ol.

psonice
external xml = ok

Ol,

Not sure what you mean in the first sentence. I thought QC was not supported at all until 10.4?

Anyway, using external xml files works fine; I used that a few times in my demo and exported it as video with no issues. Make sure you use the full path to the xml file though. I'm not sure what happens exactly with QuickTime, but within QC you can just use "myfile.xml" as the path, and if it's in the same place as your composition it will load. In an application (and QuickTime is probably the same), that won't work; you'd have to use "~/Documents/Compositions/myfile.xml" (or whatever your path is).

oloool
It works ;-)

hello psonice,

yes, now it works. Since I connect my xml via a local file path (~/Documents/...) it works. So easy – thank you. It doesn't work over the http protocol in the QT player, though.

I googled the world for this simple solution yesterday...

regards, ol

cwright
Comp Path in QT

I'm not sure what happens exactly with quicktime

Within the QC Environment, there's a global dictionary that has a bunch of configuration settings. One of these settings is the "Composition Directory", which is where the composition is located on-disk. When you type a path into QC, it does the following:

  1. If the first character is /, the path is interpreted as absolute
  2. If the first character is ~, the path is interpreted as user-directory relative
  3. If the above two fail, pull the composition's path from the dictionary, and make the input relative to that.

So it normally works as you'd expect. However, I've not really seen any other applications that correctly set up this dictionary, so patches that need path info fail the first two steps, can't use the third, and default to something like "/", which is almost never correct. I'm not sure why the QC framework doesn't handle this automatically most of the time (you feed it a composition path, so it could know from there automatically; if you feed it raw data, I could understand it not knowing)... maybe a future version will address this.
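
In code, the lookup boils down to something like this (a sketch; compositionDir stands in for whatever the host put into that global dictionary):

    // Resolve a patch's path input the way QC does.
    #import <Foundation/Foundation.h>

    NSString *resolveQCPath(NSString *input, NSString *compositionDir)
    {
        if ([input hasPrefix:@"/"])     // 1. absolute
            return input;
        if ([input hasPrefix:@"~"])     // 2. user-directory relative
            return [input stringByExpandingTildeInPath];
        // 3. relative to the composition's on-disk location
        return [compositionDir stringByAppendingPathComponent:input];
    }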

oloool
QC before 10.4:

Not sure what you mean in the first sentence. I thought QC was not supported at all until 10.4?

Oh, I thought Quartz was introduced with 10.2, Jaguar? Not sure, maybe I'm confusing it with the upgrade from QuickTime 6 to QuickTime 7. I remember I had to install 10.3 Panther to render my titlegenerator.qtz for use with FCP.

This title generator was connected to an xml file via my local MAMP server. I didn't have the idea to connect the xml via a local file path, or it was just not the right way...

regards, ol.

cwright
chronology

Quartz (not Composer) was introduced in 10.2. Quartz Composer, on the other hand, is a separate tool (that makes use of Quartz, among many other technologies) that was made available in 10.4. I'm not aware of any backports of Quartz Composer to 10.3 or earlier, but perhaps the parts that were built into QuickTime were 'inadvertently' ported to 10.3? I don't know; I've never worked on a 10.3 system.

hekxsa
previous to that

Pixelshox became QC in Tiger; there was some really neat stuff done with it too:

http://www.pol-online.net/pixelshox/

I remember giving it a spin to check out old projects in Tiger, but if you need (or think it's worth the trouble) to port some things 'down' to 10.3, have a look at that.

=========

a problem: As discussed here, I have prepared a composition, an RSS reader in fact, and...

  1. exported a .mov from QC
  2. exported this .mov to .mov (encoding to video) in QuickTime Pro - takes some time, you know

everything worked fine, got a clip, and went on refining the layout,

repeated the cycle, and to my horror the three main elements - loaded video loops - ceased to show. Instead I get empty sprites, and QuickTime crashes after a short run. It is as if the files were completely lost - and yes, the paths are absolute, both for the videos and the RSS.

the question is: what am I missing? The composition works just fine in QC; I swear I did not touch the parts that display the missing video. QC suddenly started exporting broken .movs.

a magic setting that I missed? is it a complexity issue? should I resolve the movie address closer to the root instead of the leaves of the composition tree?

'am I alone in this?'

well, the same in pictures

cheers hekxsa

Attachments:
disaster.png (299.86 KB)
thegoal.png (271.87 KB)