how to offline or non-realtime render an iTunes Visualizer

boyfriendacademy

I'm trying to turn some of my original songs and remixes into music videos by using music visualizers.

The screen capture method, using Snapz Pro X to record an iTunes Visualizer, isn't producing great results for me. I've tried tweaking the settings, but my trusty old MacBook Pro just isn't capable of simultaneously rendering and recording 1920x1080 at 30fps. Here's the best I was able to achieve, at 1280x720:

So I started looking into offline or non-realtime solutions and came across Kineme Audio Tools. I've looked over the examples, but I haven't been able to figure out how to take a Quartz Composition that's designed as a realtime iTunes Visualizer (for example, the Jelly visualizer that comes with iTunes) and convert it into a non-realtime visualizer using the Kineme Audio File Input. Can it be done? I'm not a QC expert, so any help would be greatly appreciated.
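Conceptually, what the Audio File Input has to enable is the switch from "whatever audio is playing right now" to "the slice of the file that this frame covers." Here's a minimal Python sketch of that offline mapping; the frame rate, sample rate, and the choice of RMS level as the visual driver are my own illustrative assumptions, not Kineme's actual API:

```python
import math

FPS = 30            # target video frame rate
SAMPLE_RATE = 44100 # audio sample rate of the decoded file

def frame_window(samples, frame_index, fps=FPS, rate=SAMPLE_RATE):
    """Return the slice of audio samples that video frame `frame_index` covers."""
    start = int(frame_index * rate / fps)
    end = int((frame_index + 1) * rate / fps)
    return samples[start:end]

def rms(window):
    """Root-mean-square level of one frame's audio slice (drives the visuals)."""
    if not window:
        return 0.0
    return math.sqrt(sum(s * s for s in window) / len(window))

def render_levels(samples, fps=FPS, rate=SAMPLE_RATE):
    """Offline render loop: one iteration per video frame, no realtime clock.
    Each frame looks up its audio by index, so rendering can run slower
    (or faster) than realtime without dropping anything."""
    n_frames = int(len(samples) * fps / rate)
    return [rms(frame_window(samples, i, fps, rate)) for i in range(n_frames)]
```

The key point is that nothing here depends on wall-clock time: frame 451 always sees the same 1/30th of a second of audio, which is exactly what a realtime visualizer can't guarantee.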

And instead of using Quartz Crystal, I was thinking about trying the QuickTime Pro render method described here, just because it seems easier.

I don't necessarily need the audio track included in the rendered file. I can add the audio later in Final Cut Pro.

Pursuing a non-QC route, I found a cool offline audio-reactive visualizer done in Processing by visual artist Pedro Cruz that renders a sequence of PNG images, which can then be combined into a video using QuickTime Pro's "Open Image Sequence". This method allowed me to create a beautiful 1920x1080 video at 30fps. Because it's highly detailed, I rendered and edited it using the ProRes 4444 codec. Unfortunately, YouTube's H.264 transcoding doesn't really do it justice, but you can get the idea:
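For anyone curious about the mechanics of the frame-sequence approach: the trick is just writing one numbered image per video frame (in Processing that's what saveFrame() does). Here's a rough stdlib-Python stand-in for the same pattern, not Cruz's actual code; it's grayscale-only, and the zero-padded file-name pattern is my own choice so that "Open Image Sequence" picks the frames up in order:

```python
import struct, zlib

def _chunk(tag, data):
    # Each PNG chunk is: 4-byte length, tag, data, CRC over tag + data.
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data) & 0xffffffff))

def png_bytes(pixels):
    """Encode a list of rows of 8-bit grayscale values (0-255) as a PNG file."""
    h, w = len(pixels), len(pixels[0])
    # Filter type 0 ("None") prefixed to every scanline, then deflate the lot.
    raw = b"".join(b"\x00" + bytes(row) for row in pixels)
    return (b"\x89PNG\r\n\x1a\n"
            + _chunk(b"IHDR", struct.pack(">IIBBBBB", w, h, 8, 0, 0, 0, 0))
            + _chunk(b"IDAT", zlib.compress(raw))
            + _chunk(b"IEND", b""))

def save_frame(index, pixels, pattern="frame-{:05d}.png"):
    """Write one numbered frame; zero-padding keeps the sequence sorted
    correctly when QuickTime Pro (or ffmpeg) reads the folder back."""
    with open(pattern.format(index), "wb") as f:
        f.write(png_bytes(pixels))
```

Render all your frames this way, point "Open Image Sequence" at the folder at 30fps, and you get the same decoupling from realtime as above: each file takes as long to draw as it takes.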

Any QC or Processing artists out there who would like to collaborate with me on more music video projects like this? Right now I have about 5 more tracks I'd like to turn into videos.

Quartz Crystal

So I'm playing around with the new Crystal today and came up with some interesting ideas. I'd like to have some external control of Crystal.

Let's say I have a directory scanner on my server watching an uploads folder. When a new video is uploaded, it would be great if there were some way to tell Crystal from QC to start rendering "this" composition, meaning the open comp containing the directory scanner.

I'm not sure if an AppleScript dictionary or some kind of network tool could be made for Crystal, but I think it would be a cool feature to have. Maybe a Crystal plugin that saves the open comp under another name and then offline-renders that saved copy?
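The watch-folder half of this is easy to sketch; it's the hand-off to Crystal that doesn't exist yet. Assuming some future render hook, the server side might look like this polling loop in Python (all names here are mine, and `render` is a hypothetical callback, not a real Crystal interface):

```python
import os, time

def scan_new(path, seen):
    """One poll of the uploads folder: return the file names that have
    appeared since the `seen` snapshot, plus the updated snapshot."""
    current = set(os.listdir(path))
    return sorted(current - seen), current

def watch_uploads(path, render, poll_seconds=2.0):
    """Fire render(filepath) once for each new upload. `render` is the
    hypothetical hook where you'd hand off to Quartz Crystal; Crystal
    exposes no such external interface today, which is the feature request."""
    seen = set(os.listdir(path))
    while True:
        time.sleep(poll_seconds)
        new, seen = scan_new(path, seen)
        for name in new:
            render(os.path.join(path, name))
```

That's the "Vimeo: your video is waiting to be processed" shape: uploads queue up, and each one gets rendered offline with whatever effects or watermark comp is open.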

Not sure if I explained myself right. Just think Vimeo: "your video is waiting to be processed." That way you could apply effects or embed watermarks to uploaded content, stuff like that. ;)