Multi-Touch Music Box

So this is a comp I have been working on. It is a pretty intensive process, as I am using Quartz Composer, Unity3D, and Ableton Live. In future versions I'm pretty sure this will have to be a multi-machine setup: one machine for rendering, one for audio, and so on.

Any thoughts on how a 3D OSC/TUIO/MIDI node-graph type of program should work? It's all experimental at the moment, so I'm making it up as I go along.

The basic idea is to place these nodes, or units as I'm calling them, in chains on the screen to build up a graph that produces usable MIDI data for synthesis. The quick low-down: this patch sends the position, rotation, and scale vectors as MIDI control messages, as well as a randomized arbitrary MIDI note map, to Ableton.
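
To make the MIDI leg concrete, here is a minimal Python sketch of that traffic using the mido library. The port name, CC numbering, and 0..1 normalization are all placeholder choices, not what the actual patch uses:

```python
import random
import mido

out = mido.open_output('IAC Driver Bus 1')  # port name assumed for illustration

def to_cc(v):
    """Clamp a value normalized to 0..1 into the 0-127 MIDI CC range."""
    return max(0, min(127, int(v * 127)))

def send_node(position, rotation, scale, base_cc=20):
    """Send a node's position, rotation, and scale vectors as nine
    consecutive CC messages (the CC numbering here is arbitrary)."""
    for i, v in enumerate((*position, *rotation, *scale)):
        out.send(mido.Message('control_change', control=base_cc + i, value=to_cc(v)))
    # the randomized note map: a raw note index in the 0-8 range
    out.send(mido.Message('note_on', note=random.randint(0, 8), velocity=100))

send_node(position=(0.5, 0.2, 0.8), rotation=(0.0, 0.25, 0.0), scale=(0.4, 0.4, 0.4))
```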

This is currently in progress, but if anybody is into experimental ways of making music and wants to try it, I'm happy to supply the Quartz Builder binary and Unity files, as things seem to run smoother from the binaries.

Here is a more detailed explanation from Vimeo...

This is an early experimental multi-touch music project I am working on. It is inspired by the Reactable and utilizes the TUIO protocol to transmit cursor data to Unity via xTuio. In addition to being rendered in Unity, Quartz Composer is used to parse the spectrum's peak amplitude for each instrument, which is also fed into the Unity render engine via OSC to visualize the waveform between the 3D music box nodes.
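
As a sketch of the OSC leg, something like the following (using python-osc) could stand in for the Quartz Composer side, pushing one peak-amplitude float per instrument to Unity. The /musicbox/peak address and port 9000 are made up for illustration:

```python
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient('127.0.0.1', 9000)  # Unity's OSC listener (port assumed)

def send_peaks(peaks):
    """Send one peak-amplitude float per instrument to the render engine.
    The address pattern is illustrative, not the project's real one."""
    for instrument, amplitude in peaks.items():
        client.send_message(f'/musicbox/peak/{instrument}', float(amplitude))

send_peaks({'synth1': 0.82, 'synth2': 0.14})
```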

There are two types of nodes, both of which transmit MIDI note and control data to Ableton Live for synthesis: a synth unit and an fx unit. Although the nodes act as two different classes, they are inherently the same objects transmitting the same data to Live.
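
One way to read "two classes, same underlying object" is a shared base class whose subclasses differ only in role. The real project is a Unity scene, so this Python sketch is only the shape of the idea:

```python
import random

class Unit:
    """Base node: every unit carries a transform and emits both MIDI
    note and MIDI control data, whatever its nominal role."""
    def __init__(self, position=(0, 0, 0), rotation=(0, 0, 0), scale=(1, 1, 1)):
        self.position, self.rotation, self.scale = position, rotation, scale

    def note_index(self):
        return random.randint(0, 8)  # raw note, quantized to a scale later

    def control_values(self):
        return (*self.position, *self.rotation, *self.scale)

class SynthUnit(Unit):
    pass  # the role is just a label; the transmitted data is identical

class FxUnit(Unit):
    pass  # so an fx unit can act as a synth node, and vice versa
```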

Meaning the fx unit nodes can also act as synth nodes and utilize their MIDI note data, and likewise the synth unit nodes also send MIDI control messages. Each node transmits a random note in the range of 0 to 8, which is subsequently parsed and retransmitted to Live in a particular scale. In addition, music box also sends 8 harmonic notes based on transformation logic, and transmits the rotation, position, and local scale vectors for each node as MIDI control messages.
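
The note-parsing step might look something like this; the C major mapping and the harmonic spread are stand-ins, since the actual scale and transformation logic aren't specified here:

```python
# C major degrees for note indices 0-8 (the scale actually used isn't specified)
C_MAJOR = [0, 2, 4, 5, 7, 9, 11, 12, 14]

def quantize(note_index, root=60):
    """Map the raw 0-8 note index onto a scale rooted at middle C."""
    return root + C_MAJOR[note_index]

def harmonics(base_note, scale_vector):
    """Derive 8 companion notes from the node's transform; using the
    local scale's magnitude as the interval spread is pure invention."""
    spread = max(1, int(sum(scale_vector)))
    return [min(127, base_note + i * spread) for i in range(1, 9)]

print(quantize(3))                        # -> 65
print(harmonics(quantize(3), (1, 1, 1)))  # 8 notes fanned out above it
```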

So what this means is you could use a two-finger pinch zoom to scale a node up, which in turn can be mapped to an instrument's amplitude, or whatever parameter you want scale to drive. Although this video only shows positional usage, both position and rotation vectors are sent to Live as MIDI CC; doing a pinch-zoom pivot rotation will give you the XYZ rotation messages to map.
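
For the pinch gesture itself, the mapping could be as simple as the ratio of the current finger distance to the starting one, clamped into CC range (again a sketch, with an assumed maximum scale of 4x):

```python
def pinch_to_cc(d_start, d_now, max_scale=4.0):
    """Turn a two-finger pinch into a 0-127 CC value: the node's scale
    factor is the ratio of current to starting finger distance."""
    scale = d_now / d_start
    return max(0, min(127, int(scale / max_scale * 127)))

print(pinch_to_cc(100.0, 200.0))  # fingers spread to 2x -> CC value 63
```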

The fx units are chained to the synth unit nodes via a spring joint, so the position of an fx unit can modify the location and rotation of a synth node through rigid-body physics. This means the control messages of the synth unit can also be affected by the fx unit, and vice versa. This is all experimental, and there is no right configuration at the moment.
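
Unity's SpringJoint handles the physics for real; as a toy stand-in, a damped spring step like the one below shows how an fx unit's position would tug a synth node around. Since the synth node's transform is what gets sent as CC, the fx unit ends up modulating those messages indirectly:

```python
def spring_step(synth_pos, fx_pos, synth_vel, k=8.0, damping=0.9, dt=1 / 60):
    """One integration step of a damped spring pulling the synth node
    toward the fx unit (spring constant and damping are made up)."""
    force = tuple(k * (f - s) for s, f in zip(synth_pos, fx_pos))
    synth_vel = tuple((v + fo * dt) * damping for v, fo in zip(synth_vel, force))
    synth_pos = tuple(s + v * dt for s, v in zip(synth_pos, synth_vel))
    return synth_pos, synth_vel

pos, vel = (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
for _ in range(120):  # two seconds at 60 fps, drifting toward the fx unit
    pos, vel = spring_step(pos, (2.0, 0.0, 1.0), vel)
print(pos)
```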