How to offline or non-realtime render an iTunes Visualizer

boyfriendacademy

I'm trying to turn some of my original songs and remixes into music videos by using music visualizers.

The screen capture method using Snapz Pro X to record an iTunes Visualizer isn't producing great results for me. I've tried tweaking the settings, but my trusty old MacBook Pro just isn't capable of simultaneously rendering and recording 1920x1080 at 30fps. Here's the best I was able to achieve, at 1280x720: http://www.youtube.com/watch?v=I2a_TxquY-4.

So I started looking into offline or non-realtime solutions and came across Kineme Audio Tools. I've looked over the examples but I haven't been able to figure out how to take a Quartz Composition that's designed to be a realtime iTunes Visualizer (for example, the Jelly visualizer that comes with iTunes) and convert it to a non-realtime visualizer using the Kineme Audio File Input. Can it be done? I'm not a QC expert, so any help would be greatly appreciated.

And instead of using Quartz Crystal, I was thinking about trying the Quicktime Pro render method as described here: http://www.udart.dk/2009/02/25/rendering-quartz-composer-compositions/ just because it seems easier.

I don't necessarily need the audio track included in the rendered file. I can add the audio later in Final Cut Pro.

Pursuing a non-QC route, I was able to find a cool offline audio-reactive visualizer written in Processing by visual artist Pedro Cruz that renders a sequence of PNG images, which can then be combined into a video using QuickTime Pro's "Open Image Sequence". This method allowed me to create a beautiful 1920x1080 video at 30fps. Because it's highly detailed, I rendered and edited it using the ProRes 4444 codec. Unfortunately, YouTube's H.264 transcoding doesn't really do it justice, but you can get the idea: http://www.youtube.com/watch?v=XRld-qheX5w.
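For anyone curious why the offline approach works so well in principle, here's a rough Python sketch (my own illustration, not Pedro Cruz's code): because nothing has to happen in real time, you can precompute one audio value per video frame up front, then spend as long as you like rendering each frame.

```python
import math

def frame_amplitudes(samples, sample_rate, fps=30):
    """Split an audio sample list into video-frame-sized chunks and
    return one RMS amplitude per video frame (non-realtime analysis)."""
    samples_per_frame = int(sample_rate / fps)
    amps = []
    for start in range(0, len(samples), samples_per_frame):
        chunk = samples[start:start + samples_per_frame]
        if not chunk:
            break
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        amps.append(rms)
    return amps

# One second of a 440 Hz sine at 44.1 kHz stands in for a real track.
rate = 44100
samples = [math.sin(2 * math.pi * 440 * n / rate) for n in range(rate)]
amps = frame_amplitudes(samples, rate)
print(len(amps))  # 30 values: one per video frame at 30 fps
```

Each value in `amps` can then drive whatever visual parameter you like while rendering frame N, with no dependence on playback speed.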

Any QC or Processing artists out there that would like to collaborate with me on more music video projects like this? Right now I have about 5 more tracks I'd like to turn into videos.


destroythings
Re: how to offline or non-realtime render an iTunes Visualizer

Add a Syphon server to your compositions, then use Syphon Recorder to record them at your chosen resolution.

http://syphon.v002.info/

http://syphon.v002.info/recorder/

https://vimeo.com/14533128

boyfriendacademy
Re: how to offline or non-realtime render an iTunes Visualizer

Thanks JW. Syphon is interesting, but I don't see how it helps me with a non-realtime render. It looks like Syphon Recorder still requires realtime playback of the visualizer?

gtoledo3
Re: how to offline or non-realtime render an iTunes Visualizer

Most of your "problems" are going to result from this:

The stock Apple audio patch outputs a "spectrum" structure that's fundamentally different from the waveform, volume peaks, and frequency info delivered by the Kineme Audio File Input patch (the one you want to use for offline stuff).
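To make that distinction concrete, here's a rough Python illustration (not QC's actual structure format, just the general idea) of the different kinds of data one buffer of audio can yield, which is why downstream patches wired for one shape won't accept the other:

```python
import cmath
import math

def analyze_chunk(samples):
    """From one buffer of audio samples, derive the three kinds of data
    mentioned in the thread: waveform, volume peak, and frequency spectrum."""
    waveform = list(samples)                       # raw sample values
    peak = max(abs(s) for s in samples)            # single volume peak
    n = len(samples)
    # naive DFT magnitudes for the first n//2 bins (illustration only)
    spectrum = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) / n
                for k in range(n // 2)]
    return waveform, peak, spectrum

# a 16-sample buffer containing a single cycle of a sine wave
buf = [math.sin(2 * math.pi * t / 16) for t in range(16)]
waveform, peak, spectrum = analyze_chunk(buf)
print(round(peak, 3), spectrum.index(max(spectrum)))  # peak 1.0, energy in bin 1
```

A visualizer built against the spectrum shape has to be rewired before it can consume waveform or peak data, and vice versa.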

My suggestion is this: rather than thinking about retrofitting the stock Apple audio visualizers with the Kineme Audio File Input patch, look at the Kineme audio patch sample compositions first.

http://kineme.net/release/AudioTools/10

Download those sample compositions and look through how the data is drawn. Check out "audio frequency.qtz" and hit the spacebar to flip through the different settings. Check out something like the actual "audio file input.qtz" composition and how it uses the built-in waveform image output to clever effect with some accumulator feedback. Start riffing on those ideas, figure out what's going on, and you'll be better able to understand how to "retrofit" Apple's visualizers should you see fit.

The stock "music visualizer" template.qtz (from Apple) has a javascript patch inside of it for processing the increasing and decreasing peak of a "single lane" structure.

You can also use that JavaScript with the Kineme audio patches if you wish to smooth their structures. If you're trying to deal with multi-channel output from the Kineme Audio Tools, you may need a Structure Index patch to pick out which channel to process (the left side, the right side, or a given channel if data is coming from a multi-channel audio interface in a live scenario).
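For reference, the kind of peak smoothing that JavaScript patch performs can be sketched like this (a Python illustration of the general attack/decay idea, not Apple's actual script):

```python
def smooth_peak(raw_peaks, attack=1.0, decay=0.05):
    """Envelope follower in the spirit of the template's JavaScript patch:
    jump up quickly toward a rising peak, fall back slowly when it drops."""
    smoothed, level = [], 0.0
    for p in raw_peaks:
        if p > level:
            level += (p - level) * attack   # rise (instantly, with attack=1)
        else:
            level -= (level - p) * decay    # fall gradually
        smoothed.append(level)
    return smoothed

print(smooth_peak([0.0, 1.0, 0.0, 0.0]))  # rises to 1.0, then decays slowly
```

The slow decay is what keeps bars and blobs from flickering on every transient; the same treatment can be applied per channel to the Kineme structures.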

So, I'd suggest using the audio file input patch, and also placing an audio file player patch set to trigger on QC's editor, so that you can have your music playing in the background as you design the animation. If you work around that premise, even if the visual gets laggy and into non-realtime territory, you can have a pretty good idea what's up. Minimize the size of your Viewer as well when aiming for stuff that you know isn't going to ever have to be "real-time", and then render at whatever resolution and fps you need. I'd suggest using the png codec, maximum quality, and then running through whatever other compressed codec (like H264, etc) you need to for final delivery after you do the full quality render.

dust
Re: how to offline or non-realtime render an iTunes Visualizer

You can get all the tools needed here to do a nice offline render: Kineme Audio Tools, Quartz Crystal, and Kineme's Value Historian patch.

1. Add a File Player and File Input patch to get your audio spectrum and to hear the audio.
2. Do something with the peak amplitude, audio spectrum, or waveforms.
3. Once you have your visualizer designed, record your peak amplitude or spectrum data into the Value Historian patch, then save the data as a plist file to disk.
4. Now use the data you recorded to drive your visualizer, by setting the Historian to read from disk instead of using the audio patches.
5. Offline render with Quartz Crystal; your patch will read the data from the file instead of using the real-time audio data.
6. Put the audio and video together using Final Cut.
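The record-then-replay idea in steps 3-5 boils down to something like this (a Python sketch of the concept using plistlib; Value Historian's actual file layout may differ):

```python
import io
import plistlib

# --- record pass: capture one value per frame while the audio plays ---
frames = [{"frame": i, "peak": round(abs((i % 10) - 5) / 5, 2)}
          for i in range(90)]          # stand-in for live audio peak data
buf = io.BytesIO()                     # a file on disk in the real workflow
plistlib.dump(frames, buf)             # "save the data as a plist"

# --- playback pass: the offline render reads the values back per frame ---
buf.seek(0)
recorded = plistlib.load(buf)
print(len(recorded), recorded[0]["peak"])
```

Once the values live in the plist, the render no longer depends on real-time audio at all, so Quartz Crystal can take as long as it needs per frame.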

gtoledo3
Re: how to offline or non-realtime render an iTunes Visualizer

Why does the Value Historian need to be used? The Kineme Audio File Input works offline.

dust
Re: how to offline or non-realtime render an iTunes Visualizer

Oh, no kidding? That's awesome. cybero suggested using the Value Historian some time ago and I have been doing that ever since; he must have been talking about using it to record real-time audio, like from the mic, and not from a file. There are also some real-time controls built into Quartz Crystal now that I'm not really familiar with, but you might want to look into them for your offline rendering. You could also build your own with a QC composition renderer class, but that really isn't a plug-and-play, drag-and-drop type of solution.

gtoledo3
Re: how to offline or non-realtime render an iTunes Visualizer

Yeah, you'd pretty much only need the Value Historian if you wanted to grab the data from the stock Apple patch.

I guess the best thing for that approach would be to play through the song, recording that data with Value Historian, with maybe even just a single billboard firing off - the bare minimum to jog evaluation. That way Value Historian could record every possible frame of data without any "drops" because of some kind of intense graphic being rendered. (Just thinking "aloud" here.)

I'm totally unfamiliar with Quartz Crystal's Value Historian-like recording process, so maybe there is something to be had with that. I never "upgraded"; it seemed weird to have to, I guess, and I'd had bugs with the o.g. that were never resolved.

cybero
Re: how to offline or non-realtime render an iTunes Visualizer

Quartz Crystal records only keyboard, mouse scroll and mouse pointer events at present.