Getting iTunes composition working

Posted by nickharambee

The attached composition is a revision of one I used to use with iVisualize to generate a visualisation in iTunes. I have needed to use the new Track Info patch to display the track info in the visualisation, as iVisualize no longer works with iTunes. The new patch works fine, but the part of the composition that generates a slideshow from images found in the album folder isn't working, and I'm not sure what aspect of the composition needs changing. I had some help with the previous composition and it was designed a long time ago, so I've forgotten exactly how it all fits together.

Essentially the composition is designed to count the number of images in the folder and then display each image as the song plays, e.g. if there are 4 images and the track is 3 minutes long then each image should be displayed for 45 seconds with a fade between images. For some reason no images are showing. I can’t work out where the track length is calculated - this can be extracted from the track info patch, but isn’t linked up in the composition, so I wonder if this is the issue. There doesn’t seem to be a way of testing live in QC with the track info, or at least I haven’t found one, so troubleshooting is difficult.
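The timing logic described above can be sketched in a few lines. This is an illustrative Python sketch, not the composition itself (the real version is wired up with QC patches); the function name and signature are my own invention:

```python
def image_index(elapsed_seconds, track_length_seconds, image_count):
    """Which image should be on screen at a given moment.

    The track is split into image_count equal slots, so each image is
    shown for track_length / image_count seconds (a crossfade would be
    layered on top of this in the composition).
    """
    if image_count == 0 or track_length_seconds <= 0:
        return None  # nothing to show
    slot = track_length_seconds / image_count  # e.g. 180 s / 4 images = 45 s
    index = int(elapsed_seconds // slot)
    return min(index, image_count - 1)  # clamp at the last image

# The example from the post: 4 images over a 3-minute track = 45 s each.
assert image_index(0, 180, 4) == 0
assert image_index(45, 180, 4) == 1
assert image_index(179, 180, 4) == 3
```

Note that the calculation needs the track length as an input, which matches the suspicion above: if totalTimeInMS from the Track Info patch never reaches this part of the composition, the slot duration is undefined and no image is selected.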

Any help to get this working would be much appreciated.



How to offline or non-realtime render an iTunes Visualizer

Posted by boyfriendacademy

I'm trying to turn some of my original songs and remixes into music videos by using music visualizers.

The screen capture method, using Snapz Pro X to record an iTunes Visualizer, isn't producing great results for me. I've tried tweaking the settings, but my trusty old MacBook Pro just isn't capable of simultaneously rendering and recording 1920x1080 at 30 fps. Here's the best I was able to achieve, at 1280x720:

So I started looking into offline or non-realtime solutions and came across Kineme Audio Tools. I've looked over the examples but I haven't been able to figure out how to take a Quartz Composition that's designed to be a realtime iTunes Visualizer (for example, the Jelly visualizer that comes with iTunes) and convert it to a non-realtime visualizer using the Kineme Audio File Input. Can it be done? I'm not a QC expert, so any help would be greatly appreciated.
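The core difference between the two modes is worth spelling out: a realtime visualizer reacts to live audio as it plays, while a non-realtime one decodes the whole track up front and, for each video frame, analyses the slice of samples that frame covers. The sketch below illustrates that per-frame mapping in Python with a synthetic sine wave; the names are mine and this is not the Kineme Audio File Input's actual API, just the idea behind it:

```python
import math

SAMPLE_RATE = 44100  # samples per second of audio
FPS = 30             # video frames per second

def frame_window(frame_index, samples, fps=FPS, rate=SAMPLE_RATE):
    """Return the audio samples covered by one video frame.

    Frame n spans time [n/fps, (n+1)/fps), so at 44100 Hz and 30 fps
    each frame covers 1470 samples, regardless of render speed.
    """
    start = int(frame_index * rate / fps)
    end = int((frame_index + 1) * rate / fps)
    return samples[start:end]

def rms_volume(window):
    """Root-mean-square level of a window (a stand-in for whatever
    analysis the visualizer does, e.g. a spectrum)."""
    if not window:
        return 0.0
    return math.sqrt(sum(s * s for s in window) / len(window))

# Demo input: one second of a 440 Hz sine at amplitude 0.5.
samples = [0.5 * math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)
           for n in range(SAMPLE_RATE)]

# Frame 0 covers samples 0..1469; a sine of amplitude 0.5 has an RMS
# near 0.5 / sqrt(2), about 0.354.
level = rms_volume(frame_window(0, samples))
```

Because each frame's audio window is derived from the frame index rather than the wall clock, the render can run as slowly as it likes and the visuals stay in sync with the track, which is what makes offline 1080p/30fps output feasible on modest hardware.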

And instead of using Quartz Crystal, I was thinking about trying the QuickTime Pro render method described here, simply because it seems easier.

I don't necessarily need the audio track included in the rendered file. I can add the audio later in Final Cut Pro.

Pursuing a non-QC route, I found a cool offline audio-reactive visualizer, written in Processing by the visual artist Pedro Cruz, that rendered a sequence of PNG images which could then be combined into a video using QuickTime Pro's "Open Image Sequence". This method allowed me to create a beautiful 1920x1080 video at 30 fps. Because it's highly detailed, I rendered and edited it using the ProRes 4444 codec. Unfortunately, YouTube's H.264 transcoding doesn't really do it justice, but you can get the idea:

Any QC or Processing artists out there that would like to collaborate with me on more music video projects like this? Right now I have about 5 more tracks I'd like to turn into videos.

RileyFlow (Composition by cybero)

Author: cybero
License: Public Domain
Date: 2011.07.26
Compatibility: 10.6, 10.7
Required plugins:
Image_Rehab Virtual Macro

A purely Core Image kernel based iTunes visualizer.

Inspired by the 60s OpArt of Bridget Riley.

Employs Image Rehab, and is thus not 10.5 compatible.

Public Domain | Free Document Licence

iTunes Extended Protocol Inputs

If you look at the iTunes Classic Visualizer, you'll notice that the waveform it uses has much more than 16 pieces of information; it more closely resembles the Kineme Audio Input. I was wondering, is there any way I can access this information?

Looking back to before Quartz Composer was publicly integrated into iTunes, the program iVisualize claimed that the following variables were accessible:

images: albumArt, volumeArt, spectrumArt, spectrumLineArt, volumeLineArt, volumeArtL, spectrumArtL, spectrumLineArtL, volumeLineArtL, volumeArtR, spectrumArtR, spectrumLineArtR, volumeLineArtR

structure (an array of 12 elements): spectrum, spectrumL, spectrumR

strings: name, artist, album, fileName, genre, kind, comments, composer

numbers: elapsedTime, volumePeak, volumePeakL, volumePeakR, trackNumber, numTracks, year, soundVolumeAdjustment, totalTimeInMS, startTimeInMS, stopTimeInMS, sizeInBytes, sampleRateFixed, fileType, date, userRating, discNumber, numDiscs, playCount, lastPlayDate, beatsPerMinute, spectrumSum, spectrumAverage, spectrumVariance, spectrumSumL, spectrumAverageL, spectrumVarianceL, spectrumSumR, spectrumAverageR, spectrumVarianceR

boolean: isPlaying, isCompilationTrack, isFullscreen, isDemoMode (always false)

Stream Infos (all strings): version, streamTitle, streamURL, streamMessage
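Several of the derived numbers in the list above are simple statistics over the 12-element spectrum structure. The following illustrative sketch shows the relationship; the sample values are made up, and I'm assuming population (not sample) variance, which iVisualize's documentation doesn't specify:

```python
# A hypothetical 12-element spectrum structure, as iVisualize exposed it.
spectrum = [0.1, 0.4, 0.2, 0.0, 0.3, 0.5, 0.2, 0.1, 0.0, 0.6, 0.3, 0.2]

# The derived scalar inputs from the list above.
spectrum_sum = sum(spectrum)
spectrum_average = spectrum_sum / len(spectrum)
spectrum_variance = sum((x - spectrum_average) ** 2
                        for x in spectrum) / len(spectrum)
```

The same relationships presumably hold for the per-channel variants (spectrumSumL over spectrumL, and so on), which would make the scalar inputs a convenience rather than extra information.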

QuartzGL in iTunes

Posted by cybero

This question was in some slight fashion prompted by postings about booting into a 64 bit kernel.

Of course, not all applications are 32/64-bit switchable, and iTunes 9.2.1 is actually meant to be usable on 10.4 upwards, so no surprise, I guess, that it hasn't been made switchable at all.

It does have one rather interesting setting, left unflagged no doubt because the application isn't intended to run solely on machines that are Quartz Extreme capable (some Macs that can run 10.4 don't have Quartz Extreme capable graphics cards, as they were produced before that particular specification came out).

In the iTunes application's package contents one can find the Info.plist.

It has a setting, QuartzGLEnable; as noted, it is unflagged by default.

I'm just beginning to run iTunes with this flag enabled in a re-edited Info.plist file (saving it into the application's package contents required authentication), so I shall post and advise.
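For anyone wanting to try the same edit, here is a minimal sketch using Python's stdlib plistlib. It operates on a throwaway stand-in plist rather than the real file at /Applications/iTunes.app/Contents/Info.plist; back up the real one before touching it, and note that the stand-in dictionary contents here are invented for the demo:

```python
import os
import plistlib
import tempfile

# A minimal stand-in Info.plist for demonstration purposes only.
info = {"CFBundleName": "iTunes", "CFBundleVersion": "9.2.1"}

path = os.path.join(tempfile.mkdtemp(), "Info.plist")
with open(path, "wb") as f:
    plistlib.dump(info, f)

# Read the plist, flag QuartzGL on, and write it back.
with open(path, "rb") as f:
    plist = plistlib.load(f)
plist["QuartzGLEnable"] = True
with open(path, "wb") as f:
    plistlib.dump(plist, f)

# Read back to confirm the flag stuck.
with open(path, "rb") as f:
    assert plistlib.load(f)["QuartzGLEnable"] is True
```

On the Mac itself the same edit can be made with the bundled PlistBuddy tool or a text editor; the point is simply adding a boolean QuartzGLEnable key to the application's Info.plist.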

Just wondered if anyone else had seen this, and also what I have been missing by not having QuartzGL enabled. Thus far I haven't found much difference to speak of.