Quartz

Importing LR pulse from Ableton Live 9.5 into Quartz Composer

1010juju

Hello everyone,

I am wondering how to:

1) translate the pulse sound in Ableton Live 9.5 (panned left and right) into two separate black-and-white areas on the left and right sides of the screen, flashing alternately from one side to the other; and

2) route the pulse sound from Ableton directly into Quartz Composer.

I hope that explains it well enough - it is a bit difficult to describe in words... Basically, I would like the pulse to be translated into a simple black-and-white visual, one area on the left and the other on the right, with the "movement" represented by the flashing.
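The closest I have got to a plan for part 2 is to send each pulse out of Ableton as an OSC message (for example from a Max for Live device) and pick it up in Quartz Composer with the OSC Receiver patch, then use the value to drive the brightness of a sprite on each side. To test the Quartz side without Ableton I have been playing with a little Python sketch using python-osc; the addresses /pulse/left and /pulse/right, the port 9000 and the tempo are just placeholders I made up:

    # Test sender: emits an alternating left/right "pulse" over OSC, one per beat.
    # In the real setup these messages would come from Ableton instead.
    import time
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)  # host/port the composition listens on
    beat = 60.0 / 120.0                          # seconds per beat at 120 bpm

    side = "left"
    while True:                                       # Ctrl-C to stop
        client.send_message("/pulse/" + side, 1.0)    # flash this side on
        time.sleep(beat / 2)
        client.send_message("/pulse/" + side, 0.0)    # and off again
        time.sleep(beat / 2)
        side = "right" if side == "left" else "left"  # alternate sides

I have no idea whether OSC is the right route, though, or whether there is a way to get the audio itself into the composition more directly.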

Thank you all for your time!

Muneomi, from Tokyo

Trigger on Application Close

Is there a way to have Quartz perform an action before it quits? An On-Close patch would be really useful, especially for closing a file with File Tools. I bought Vuo too and didn't see a way to do it there either. Any ideas? Thanks!
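For what it's worth, the only workaround I have come up with so far is to launch the composition (exported as an application) from a small wrapper script and do the clean-up there once it quits, rather than inside the composition itself. A rough Python sketch of the idea - the app name and file path are just made-up placeholders for whatever File Tools would otherwise be cleaning up:

    # Workaround sketch: run the exported composition and clean up after it exits.
    import atexit
    import os
    import subprocess

    TEMP_FILE = "/tmp/composition-output.txt"    # file the composition writes while running

    def cleanup():
        # The sort of thing an On-Close patch would do.
        if os.path.exists(TEMP_FILE):
            os.remove(TEMP_FILE)

    atexit.register(cleanup)                     # also runs if this script is interrupted
    subprocess.run(["open", "-W", "/Applications/MyComposition.app"])  # -W waits for the app to quit

That covers my case, but it is obviously much clunkier than a real On-Close patch would be.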

Finding new Quartz

scalf

Hello,

Has anyone tried getting a new distribution of Quartz Composer from the Apple Developer site lately?

https://developer.apple.com/library/Mac/navigation/

I have not managed to find it - maybe I am missing something (like it being in a different place), but I swear it used to be here and I just can't seem to locate it anymore... I hope it didn't get shuffled away.

Thanks for any tips.


Does Mavericks support QC?

scalf

Hello,

Has anyone heard any recent, definitive information on whether the new Mac OS release, "Mavericks", will support Quartz Composer or not?

If not, does anyone have any creative tricks for keeping QC running, on a separate partition or otherwise?

Thanks

Quartz Composer + Ableton + QLab

qewrty

Hi,

So I have got a project that I am hoping I can get some setup ideas from you guys. I am open to any suggestions, as I am at the drawing-board stage at the moment.

So the idea is to have, let's say, a number of either GoPro or Replay XD cameras on stage, plug them into some sort of video matrix and then into QLab, which can then be triggered from Ableton via MIDI, either switching cameras individually or jumping to a different cue where the cameras are in a different configuration.

QLab then sends its output to Quartz Composer, where the final effects and processing are applied. The final image is sent to a projector or to a Hippo media server for mapping.

Some thoughts on why this setup: I have read a couple of posts saying that Quartz Composer doesn't handle multiple camera inputs all that well, hence the use of QLab. The reason for the GoPros or Replay XDs is that they are small and can easily be hidden in the set on stage.
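For the Ableton-to-QLab link, if direct MIDI triggering turns out to be fiddly, I was also thinking a small bridge script could translate MIDI notes from Ableton into QLab's OSC cue commands (QLab listens for OSC on port 53000). A rough Python sketch, assuming mido and python-osc are installed; the MIDI port name and the note-to-cue mapping are just placeholders:

    # Bridge sketch: MIDI notes from Ableton trigger QLab cues over OSC.
    import mido
    from pythonosc.udp_client import SimpleUDPClient

    qlab = SimpleUDPClient("127.0.0.1", 53000)         # machine running QLab
    NOTE_TO_CUE = {60: "1", 61: "2", 62: "3"}          # e.g. note 60 fires cue 1

    with mido.open_input("IAC Driver Bus 1") as port:  # virtual MIDI port fed by Ableton
        for msg in port:
            if msg.type == "note_on" and msg.velocity > 0 and msg.note in NOTE_TO_CUE:
                qlab.send_message("/cue/{}/start".format(NOTE_TO_CUE[msg.note]), [])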

Can you please give some suggestions on whether you think this is a doable setup, and whether anything can be cut from or needs to be added to it?

Thanks.