Quartz Composer

multi-touch music box

dust's picture

So this is a comp I have been working on. It is a pretty intensive process, as I am using Quartz Composer, Unity3D, and Ableton Live. In future versions I'm pretty sure this will have to be a multi-machine setup: one machine for rendering, one for audio, etc.

Any thoughts on how a 3D OSC/TUIO/MIDI node-graph type of program should work? It's all experimental at the moment, so I'm making it up as I go along.

The basic idea is to be able to place these nodes (or units, as I'm calling them) in chains on the screen to build up a graph that produces usable MIDI data for synthesis. The quick lowdown: this patch sends the position, rotation, and scale vectors as MIDI control messages, as well as a randomized, arbitrary MIDI note map, to Ableton.
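
To make that concrete, here is a little Python sketch (using the mido library) of what I mean by flattening a node's transform into CC messages. The port name, CC numbers, and value ranges here are just placeholders for illustration, not the actual mapping in the patch:

# Hypothetical sketch: flatten one node's transform into MIDI CC
# messages for Ableton Live. Port name, CC numbers, and ranges are
# assumptions, not the patch's real mapping.
import mido

def clamp_to_cc(value, lo, hi):
    """Map a float in [lo, hi] to the MIDI CC range 0-127."""
    t = (value - lo) / (hi - lo)
    return max(0, min(127, int(t * 127)))

def send_transform(port, position, rotation, scale, base_cc=20):
    """Send position/rotation/scale vectors as nine consecutive CCs."""
    values = (
        [clamp_to_cc(v, -5.0, 5.0) for v in position] +   # x, y, z position
        [clamp_to_cc(v, 0.0, 360.0) for v in rotation] +  # x, y, z rotation
        [clamp_to_cc(v, 0.1, 3.0) for v in scale]         # x, y, z scale
    )
    for offset, value in enumerate(values):
        port.send(mido.Message('control_change',
                               control=base_cc + offset, value=value))

with mido.open_output('Music Box to Live') as port:  # hypothetical port name
    send_transform(port, position=(1.2, 0.0, -0.5),
                   rotation=(0.0, 90.0, 0.0), scale=(1.5, 1.5, 1.5))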

This is currently in progress, but if anybody is into experimental ways of making music and wants to try this, I'm happy to supply the binary Quartz Builder and Unity files, as things seem to run smoother from the binaries.

Here is a more detailed explanation from Vimeo...

This is an early experimental multi-touch music project I am working on. It is inspired by the Reactable and utilizes the TUIO protocol to transmit cursor data to Unity via xTUIO. In addition to the rendering done in Unity, Quartz Composer is used to parse the spectrum's peak amplitude for each instrument, which is also fed into the Unity render engine via OSC to visualize the waveform between the 3D music box nodes.
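
If you want a feel for what that OSC leg looks like, here is a quick Python sketch using python-osc. The address pattern, host, and port are made up for illustration; the real composition does this inside QC:

# Rough sketch of forwarding each instrument's peak amplitude to a
# render engine over OSC. Address pattern, host, and port are
# assumptions for illustration.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical Unity listener

def send_peaks(peaks):
    """Send one peak-amplitude float per instrument, keyed by index."""
    for index, amplitude in enumerate(peaks):
        client.send_message(f"/musicbox/peak/{index}", float(amplitude))

send_peaks([0.82, 0.14, 0.56])  # e.g. three instruments' current peaks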

There are two types of nodes, both of which transmit MIDI note and control data to Ableton Live for synthesis: a synth unit and an FX unit. Although the nodes act as two different classes, they are inherently the same objects transmitting the same data to Live.

This means the FX unit nodes can also act as synth nodes and utilize their MIDI note data, and likewise the synth unit nodes also send MIDI control messages. The nodes transmit a random note in the range of 0 to 8; this note is subsequently parsed and retransmitted to Live in a particular scale. In addition, Music Box also sends 8 harmonic notes based on transformation logic, and transmits the rotation, position, and local scale vectors for each node as MIDI control messages.
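
For anyone curious, here is roughly what that note-parsing step could look like in Python. The scale intervals and root note are guesses on my part (the patch could be quantizing to any scale):

# A guess at the note-parsing step: take a raw note index in 0-8 and
# retransmit it quantized to a scale. Scale and root are assumptions.
import random

MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def quantize(raw, root=60, scale=MAJOR):
    """Map a raw index (0-8) onto scale degrees starting at `root` (C4)."""
    octave, degree = divmod(raw, len(scale))
    return root + 12 * octave + scale[degree]

raw = random.randint(0, 8)
print(raw, '->', quantize(raw))  # e.g. 8 -> 74 (D5 in C major)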

What this means is you could use a two-finger pinch zoom to scale the node up, which in turn can be mapped to an instrument's amplitude, or whatever parameter you want scale to control. Although this video only shows positional usage, both position and rotation vectors are sent to Live as MIDI CC; doing a pinch-zoom pivot rotation will give the XYZ rotational messages to subsequently map.

The FX units are chained to the synth unit nodes via a spring joint, so the position of the FX unit can modify the location and rotation of the synth node by utilizing rigid-body physics. This means the control messages of the synth unit can also be affected by the FX unit, and vice versa. This is all experimental, and there is no right configuration at the moment.
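
The spring coupling itself is just rigid-body physics. If it helps, here is a toy 1D Python sketch of the idea; the constants are arbitrary, and Unity's actual spring joint solver is more involved:

# Not the Unity spring joint itself, just a toy 1D model of the idea:
# an FX node attached to a synth node by a spring, so dragging one
# perturbs the other (and therefore the CC values it transmits).
def step(synth_x, fx_x, synth_v, fx_v, k=4.0, rest=1.0, damping=0.9, dt=0.02):
    """Advance both node positions one timestep under a damped spring."""
    stretch = (fx_x - synth_x) - rest      # how far past rest length
    force = k * stretch                    # Hooke's law: F = k * stretch
    synth_v = (synth_v + force * dt) * damping  # spring pulls synth toward fx
    fx_v = (fx_v - force * dt) * damping        # ...and fx toward synth
    return synth_x + synth_v * dt, fx_x + fx_v * dt, synth_v, fx_v

# Drag the FX node away and watch the synth node get dragged along:
sx, fx_pos, sv, fv = 0.0, 3.0, 0.0, 0.0
for _ in range(200):
    sx, fx_pos, sv, fv = step(sx, fx_pos, sv, fv)
print(round(sx, 2), round(fx_pos, 2))  # both settle ~1.0 apart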

GLSL shader with a seamless texture

rybotron's picture

Is there a way to create a GLSL shader that would texture a mesh with a seamless texture? Not a picture that has been made to be seamless, but rather taking an image and unfolding it over the mesh so each edge mirrors on all 4 sides each time it is tiled.
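
To illustrate the kind of mirroring I mean, here is the coordinate math sketched in Python. I believe this is what GL_MIRRORED_REPEAT wrapping does in hardware, so in a fragment shader it would presumably be a one-liner, but I haven't tested this against a mesh:

# Sketch of the mirrored-tiling math (the GLSL equivalent would be
# something like: uv = 1.0 - abs(mod(uv, 2.0) - 1.0)).
def mirror_repeat(u):
    """Fold an unbounded texture coordinate into [0, 1], flipping
    direction on every other tile so adjacent tiles share their edges."""
    return 1.0 - abs((u % 2.0) - 1.0)

# Crossing a tile boundary reverses direction instead of jumping:
for u in (0.0, 0.5, 1.0, 1.5, 2.0, 2.5):
    print(u, '->', round(mirror_repeat(u), 2))
# 0.0->0.0, 0.5->0.5, 1.0->1.0, 1.5->0.5, 2.0->0.0, 2.5->0.5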

Hiya from San Francisco

rybotron's picture

Hi everybody! My name is Ryan and I'm a motion graphics artist becoming a hardcore Quartz convert with help from everyone on this forum. I discovered Kineme3D and Quartz last year and just recently started getting heavily into it. ALL of my knowledge and inspiration has come from the people in this forum. I've been secretly stalking each of your webpages, Vimeo pages, and forum posts, and decided it's time for me to come clean and introduce myself!

I'm currently working on a live show with VDMX and Quartz, and I'm always wanting to learn more and more! I'm also gradually working on a set of "tools" to help After Effects and Cinema 4D artists use Quartz.

Although I've recently taken down most of my mograph work to build a new site and include my realtime stuff, it will be taking shape here:

http://www.rybotron.com (mostly a blog right now, but it will be turning into my demo site)

and Facebook:

http://www.facebook.com/pages/Rybotron/123804924313783

Thanks and can't wait to chat with you guys!

Souped-up Struct Environment & Structure Environment in Reverse - Obtaining all info in a Macro with a Structure Output Environment

Proposal: "get" objects that could get vertex/normal structure info post render/transforms/rotations, for Mesh/OpenCL based renderers, or even better, standard objects as well. In addition, other info about the state of the patches in a structure would be useful.

What could happen is that Render patches could be draped in an environment whose function is to provide a structure output, with each Render patch inside of the macro represented as a separate structure key in the output structure of the environment patch.

The structure key names would correspond with each renderer's already existing key name (the one you see by hovering over the patch), enumerating the structure by layer order of the Render patches, with the post-render vertex (and normal?) info of each Render patch nested inside of its keyed object. Again, this means the environment patch would convey the vertex and normal values, post transform/rotate/scale, via a structure output on an environment that wraps one or more renderers.

Something like a Clear, if on layer 1, would show up as index 0 with a key name of "clear", and possibly structure info that describes the rest of its state. A Sprite on the next layer would be index 1, have a key of "Sprite_1", info about the coordinates of each of its corners, and perhaps all other info about the state of the patch.
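
To make this concrete, here is a hypothetical mock-up of that output structure, written as a Python dict. None of these keys exist in QC today; the fields are purely illustrative:

# Hypothetical mock-up of the proposed structure output for a Clear on
# layer 1 and a Sprite on layer 2. Every key and field is illustrative.
scene_structure = {
    0: {"key": "clear", "color": (0.0, 0.0, 0.0, 1.0)},
    1: {"key": "Sprite_1",
        # corner vertices *after* transform/rotate/scale were applied
        "vertices": [(-0.5, -0.5, 0.0), (0.5, -0.5, 0.0),
                     (0.5, 0.5, 0.0), (-0.5, 0.5, 0.0)],
        # ...plus whatever else describes the patch's state
        "blending": "Over",
        "enabled": True},
}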

By being able to easily obtain the vertex info of objects post scale/transform/rotation/render, we could create events that depend on collision between objects, bar objects from touching one another, etc. In general, being able to read all info about the state of a macro via some kind of structure output just makes good sense, and after thinking about it, it's sort of surprising the functionality isn't there. Trying to do something ultra easy in another language tonight made me think of how this would enable many different functions, and it's a pretty straightforward, QC-esque way of handling it.

So, like an Image Broadcaster or Spooky, but an environment that can hold consumers and has a structure output that provides the scene structure, with additional info from the vertex calcs, etc., that the user then gets to avoid doing (or gets a real method of obtaining at all, really). This patch wouldn't "do anything"; it would be for obtaining info.

ALSO, it would be great to be able to pass arbitrary structure data to a souped-up GL Structure Environment that would work in a similar way (probably as a separate patch). If I wrote a keyed structure that corresponded with all of the key names of the patches inside, as well as their input key names, values could be transmitted to those patches without even noodling anything up, just by plugging that structure into an input port on the structure macro environment.
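
Again purely hypothetical, but the input-side structure could look something like this, keyed by patch name and then by input key name:

# Hypothetical input-side structure: plug this into the environment's
# input port to drive patches with no noodles at all. Patch and input
# key names are made up.
patch_inputs = {
    "Sprite_1": {"X_Position": 0.25, "Y_Position": -0.1, "Z_Rotation": 45.0},
    "Cube_1":   {"Width": 0.5, "Height": 0.5, "Depth": 0.5},
}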

ALSO ALSO, iteration/duplication could happen inside of the environment by hooking objects that are like Iterator Variables to patch time on things, except the number of iterations would be specified on THAT patch, not on the actual environment.

Perhaps one patch could perform both functions.

Different type of thought about QC and iOS...

gtoledo3's picture

There's been a great deal of talk about QC on iOS, exporting compositions/baked apps to various other platforms, etc., but seeing something that Lee posted about TeamViewer, along with reading some recent technical notes about the Opera browser on iOS (a proxy-based browser), got me thinking about something...

Could a qtz app be hosted and rendered on the server side of things, like a web app, so that a system viewing the graphics didn't have to have the QuartzComposer framework? And also, could touch data (in the case of iOS) be forwarded via the web app and parsed so that it could do_stuff in the composition?
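
Just to sketch the touch-forwarding half of this idea in Python (the route, ports, and OSC address are all made up): a tiny relay that accepts JSON touch events from the web client and republishes them as OSC to whichever machine is hosting and rendering the composition.

# Back-of-the-napkin sketch: relay JSON touch events from a web client
# to the composition host as OSC. All endpoints are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 8000)  # hypothetical QC host listener

class TouchRelay(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        touch = json.loads(body)  # e.g. {"id": 0, "x": 0.4, "y": 0.7}
        osc.send_message("/touch", [touch["id"], touch["x"], touch["y"]])
        self.send_response(204)   # no content; the client just fires events
        self.end_headers()

HTTPServer(("", 8080), TouchRelay).serve_forever()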