Processing

how to offline or non-realtime render an iTunes Visualizer

Posted by boyfriendacademy

I'm trying to turn some of my original songs and remixes into music videos by using music visualizers.

The screen capture method, using Snapz Pro X to record an iTunes Visualizer, isn't producing great results for me. I've tried tweaking the settings, but my trusty old MacBook Pro just isn't powerful enough to render and record 1920x1080 at 30fps simultaneously. Here's the best I was able to achieve, at 1280x720: http://www.youtube.com/watch?v=I2a_TxquY-4.

So I started looking into offline or non-realtime solutions and came across Kineme Audio Tools. I've looked over the examples, but I haven't been able to figure out how to take a Quartz Composition that's designed as a realtime iTunes Visualizer (for example, the Jelly visualizer that comes with iTunes) and convert it into a non-realtime visualizer using the Kineme Audio File Input patch. Can it be done? I'm not a QC expert, so any help would be greatly appreciated.

And instead of using Quartz Crystal, I was thinking about trying the QuickTime Pro render method described here: http://www.udart.dk/2009/02/25/rendering-quartz-composer-compositions/ just because it seems easier.

I don't necessarily need the audio track included in the rendered file. I can add the audio later in Final Cut Pro.

Pursuing a non-QC route, I found a cool offline audio-reactive visualizer done in Processing by visual artist Pedro Cruz. It renders a sequence of PNG images that can then be combined into a video using QuickTime Pro's "Open Image Sequence". This method allowed me to create a beautiful 1920x1080 video at 30fps. Because it's highly detailed, I rendered and edited it using the ProRes 4444 codec. Unfortunately, YouTube's H.264 transcoding doesn't really do it justice, but you can get the idea: http://www.youtube.com/watch?v=XRld-qheX5w.
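I don't have Pedro's source, but the general pattern seems easy to sketch: decode the whole track up front, analyze the slice of samples that lines up with each video frame, draw, and call saveFrame(). With no realtime constraint, the render can take as long as it needs. Here's a rough outline assuming Minim for the analysis (the filename and the bar drawing are just placeholders):

import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
float[] samples;   // the whole track, decoded up front (left channel)
FFT fft;
int fftSize = 1024;
float sampleRate;
int fps = 30;
int totalFrames;

void setup() {
  size(1920, 1080);
  minim = new Minim(this);
  AudioSample track = minim.loadSample("song.wav", fftSize);
  samples = track.getChannel(AudioSample.LEFT);
  sampleRate = track.sampleRate();
  fft = new FFT(fftSize, sampleRate);
  totalFrames = int(samples.length / sampleRate * fps);
  track.close();
}

void draw() {
  // analyze the chunk of samples that corresponds to this video frame
  int offset = int((frameCount - 1) * sampleRate / fps);
  float[] chunk = new float[fftSize];
  for (int i = 0; i < fftSize && offset + i < samples.length; i++) {
    chunk[i] = samples[offset + i];
  }
  fft.forward(chunk);

  // placeholder drawing: one bar per frequency band
  background(0);
  stroke(255);
  for (int i = 0; i < fft.specSize(); i++) {
    float x = map(i, 0, fft.specSize(), 0, width);
    line(x, height, x, height - fft.getBand(i) * 8);
  }

  saveFrame("frames/frame-#####.png");  // one PNG per frame
  if (frameCount >= totalFrames) exit();
}

The PNGs in frames/ then go straight into QuickTime Pro's "Open Image Sequence".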

Are there any QC or Processing artists out there who would like to collaborate with me on more music video projects like this? Right now I have about 5 more tracks I'd like to turn into videos.

JavaQCView (Composition by dust)

Author: dust
License: MIT
Date: 2011.01.16
Compatibility: 10.6
Categories:
Required plugins:
(none)

This isn't really a composition; it's basically a Java JNI lib to render Quartz Composer in Java. Eventually I will have the messaging system working so you can send messages back and forth between QC and Java.

Right now it just loads a QC comp from the repo and displays it in Java. The principles are derived from a deprecated Apple dev example, "QC Cocoa Component", circa 2005.

Even though this builds a binary Java app, I don't see why the lib couldn't be used in Processing to display a QC comp.

I don't know if this helps anybody, but I certainly like being able to load a QC comp in Java. The next step is to get published inputs and outputs working.
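To give a rough idea of the Java side of a binding like this, here's a purely illustrative sketch. The class and method names below are placeholders for the general JNI pattern, not the actual JavaQCView API:

// illustrative only; the real entry points live in the native lib
public class QCComposition {

    static {
        // the native side wraps Apple's QCRenderer, per the old
        // "QC Cocoa Component" dev example
        System.loadLibrary("JavaQCView");
    }

    // declared here, implemented in the Objective-C/C library
    public native boolean load(String qtzPath);
    public native void renderAtTime(double seconds);

    // a messaging system would hang off calls like these
    public native void setInput(String key, double value);
    public native double getOutput(String key);

    public static void main(String[] args) {
        QCComposition comp = new QCComposition();
        if (comp.load("/path/to/composition.qtz")) {
            for (int f = 0; f < 300; f++) {
                comp.renderAtTime(f / 30.0);  // drive patch time at 30 fps
            }
        }
    }
}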

Cool OpenProcessing Sketch "Schizzo 2" (and 1)

Posted by gtoledo3

I wanted to post this sketch because I think it's pretty awesome looking. It could be done in QC, but it isn't really the type of stuff one sees people trying out:

OG Link: http://www.openprocessing.org/visuals/?visualID=12878

Idea For Discussion - Offscreen Window Grab App Launcher & Embedment / Menu Bar Interrogator Super Hack Provider Plugin

This is the concept...

A QC patch launches an arbitrary app from a file path input, in a viewable secondary QC Editor app window... in the Settings panel of the patch! One can control x/y pixel width and height, as well as offset translations, similar to the v002 screen capture patch. During all of this, the app is only viewable in this largish Settings panel, or perhaps via a bigger GUI hack that launches a unique window.

One autoconfigures controls for the patch in a way similar to a composition loader. After "configuring", all of the parameters of said app that are available via the app's menu bar become available as inputs or outputs (if applicable) on the QC patch.

One can minimize or close this app Viewer window, and the QC provider patch would still output the image. The Viewer window is just for the luxury of seeing the actual window of the app one will eventually be "offscreen grabbing" from.

The app file itself should be able to be embedded in the composition.

Now, the idea is that it would be a universal way of getting visual output from other apps "into" QC, without having to worry about doing a screen grab in the traditional sense, or having to make sure that windows always open in the same place so that the image can get piped into QC (via something like v002 screengrab; see the sketch below for why that's fragile). If the menu bar, or perhaps other controls, of an app can be used to dynamically configure input/output ports on a QC patch, it would provide a modest level of interactivity as well.
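For context, the traditional grab is easy enough on the Java/Processing side with java.awt.Robot, but it shows exactly the fragility described above: the capture region is hard-coded to wherever the target window happens to sit. A minimal sketch (the coordinates are arbitrary):

import java.awt.Robot;
import java.awt.Rectangle;
import java.awt.AWTException;
import java.awt.image.BufferedImage;

// hard-coded region: the target window has to stay exactly here,
// which is the fragile part this plugin idea is meant to eliminate
Rectangle region = new Rectangle(0, 22, 640, 480);
Robot robot;

void setup() {
  size(640, 480);
  try {
    robot = new Robot();
  } catch (AWTException e) {
    e.printStackTrace();
  }
}

void draw() {
  BufferedImage buf = robot.createScreenCapture(region);
  PImage grab = new PImage(buf);  // wrap the AWT image for Processing
  image(grab, 0, 0);
}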

I see this as a provider type of patch that could be plugged into any standard QC renderer. It would be really useful for launching Processing app windows and grabbing the visual result into QC, or maybe things like Ogre or various game engines.

I'm going to label this as "semi-started", since a few of the hurdles are already sort of done...

Tendrils

Posted by jeff.clermont

Hi,

I'm trying to reproduce this effect in QC, without the drawing part (meaning only the behavior of the line once it's on screen): http://www.cs.princeton.edu/~traer/tendrils/

Does anyone know where I should start? Is it possible? The only thing I know is that it involves a spring algorithm and particles.
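It looks like the demo is built on Traer's own physics library for Processing (traer.physics). If I understand it right, the core behavior is a chain of particles joined by short springs, with the head pinned to the cursor. A minimal Processing sketch of that idea (the constants are guesses):

import traer.physics.*;

ParticleSystem physics;
Particle[] chain;
int n = 40;

void setup() {
  size(640, 480);
  // no gravity, noticeable drag so the tendril settles smoothly
  physics = new ParticleSystem(0, 0.15);
  chain = new Particle[n];
  for (int i = 0; i < n; i++) {
    chain[i] = physics.makeParticle(1.0, width/2, height/2, 0);
    if (i > 0) {
      // each segment is a short spring to its neighbor
      physics.makeSpring(chain[i-1], chain[i], 0.2, 0.2, 5);
    }
  }
  chain[0].makeFixed();  // the head is driven directly, not simulated
}

void draw() {
  background(255);
  chain[0].position().set(mouseX, mouseY, 0);  // head follows the cursor
  physics.tick();  // advance the spring/particle simulation one step
  noFill();
  stroke(0);
  beginShape();
  for (int i = 0; i < n; i++) {
    curveVertex(chain[i].position().x(), chain[i].position().y());
  }
  endShape();
}

The QC question is then how to run that same per-point spring integration, maybe inside a JavaScript patch.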

Thanks!