Ableton Live

MIDI note-off with the MIDI notes receiver?

RexTheRunt

Hi there

I hope someone can help me; I'm stuck.

I'm using a MIDI notes receiver patch to get notes in from Live and turn an image on and off. All is well when I use the keyboard to generate a note: the image goes on and off.

However, I'm trying to achieve the same thing via an M4L patch. It sends a note-on message, which works fine for the sound in Live because once the note has played it stops, but the image stays "on" until I trigger it again.

I'm not sure if there's a simple method to fix this in QC. I've tried some of the toggle patches, but I'm kind of in the dark.
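To show what I mean: the keyboard sends a note-off when the key is released, which is what flips the image back off, while my M4L patch only ever sends a note-on. Here is that pairing sketched in Python with mido, purely to illustrate the idea (the port name is a placeholder, and I gather Max's makenote object does the same note-on/note-off pairing):

import time
import mido

PORT_NAME = "IAC Driver Bus 1"  # placeholder virtual MIDI port name

def pulse_note(port, note=60, velocity=100, hold=0.1):
    # send a note-on, wait briefly, then send the matching note-off
    port.send(mido.Message('note_on', note=note, velocity=velocity))
    time.sleep(hold)  # how long the image stays "on"
    port.send(mido.Message('note_off', note=note, velocity=0))

with mido.open_output(PORT_NAME) as out:
    pulse_note(out)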

Does this make any sense, and can anyone shed some light?

Thanks!

multi-touch music box

dust

so this is a comp i have been working on. it is a pretty intensive process, as i am using quartz composer, unity3d, and ableton live. in future versions i'm pretty sure this will have to be a multi-machine setup, one machine for rendering and one for audio, etc.

any thoughts on how a 3d osc/tuio/midi node-graph type of program should work? it's all experimental at the moment, so i'm making it up as i go along.

the basic idea is to be able to place these nodes (or units, as i'm calling them) in chains on the screen to make up a graph that produces usable midi data for synthesis. the quick lowdown is that this patch sends the position, rotation and scale vectors as midi control messages, as well as a randomized arbitrary midi note map, to ableton.
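just to sketch what i mean by sending transform vectors as control messages, here is the idea in python with mido. this is an illustration, not the actual patch; the port name and cc numbers are placeholders:

import mido

def vector_to_cc(port, vec, first_cc, channel=0):
    # send the x/y/z components of vec as three consecutive CC messages,
    # assuming each component has already been normalized to 0..1
    for i, component in enumerate(vec):
        value = max(0, min(127, int(component * 127)))  # clamp to MIDI range
        port.send(mido.Message('control_change', channel=channel,
                               control=first_cc + i, value=value))

with mido.open_output("IAC Driver Bus 1") as out:  # placeholder port name
    vector_to_cc(out, (0.25, 0.5, 1.0), first_cc=20)   # position -> CC 20-22
    vector_to_cc(out, (0.0, 0.33, 0.66), first_cc=23)  # rotation -> CC 23-25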

this is currently in progress, but if anybody is into experimental ways of making music and wants to try this, i'm happy to supply the binary quartz builder and unity files, as things seem to run smoother from the binaries.

here is a more detailed explanation from vimeo...

this is an early experimental multi-touch music project i am working on. it is inspired by the reactable and utilizes the tuio protocol to transmit cursor data to unity via xTuio. in addition to the rendering in unity, quartz composer is being used to parse the spectrum's peak amplitude for each instrument, which is also fed into the unity render engine via osc to visualize the waveform between the 3d music box nodes.
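as a rough illustration of that hand-off (not the composition's actual code), pushing per-instrument peak amplitudes to unity over osc could look like this in python with python-osc; the address pattern and port are assumptions:

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # assumed port unity listens on

def send_peaks(peaks):
    # peaks: list of peak amplitudes (0..1), one per instrument track
    for track, amp in enumerate(peaks):
        client.send_message(f"/music_box/amp/{track}", float(amp))

send_peaks([0.82, 0.14, 0.47])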

there are two types of nodes, both of which transmit midi note and control data to ableton live for synthesis: a synth unit and an fx unit. although the nodes act as two different classes, they are inherently the same objects transmitting the same data to live.

meaning the fx unit nodes can also act as synth nodes and utilize their midi note data, and likewise the synth unit nodes also send midi control messages. each node transmits a random note in the range of 0 to 8, which is subsequently parsed and retransmitted to live in a particular scale. in addition, music box also sends 8 harmonic notes based on transformation logic, and transmits the rotation, position and local scale vectors for each node as midi control messages.
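here is the note parsing step sketched in python just to show the idea; the scale and root are examples, not the mapping the patch actually uses:

import random

MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

def index_to_note(index, root=60, scale=MAJOR):
    # fold an arbitrary index onto a scale degree above root (60 = C4)
    octave, degree = divmod(index, len(scale))
    return root + 12 * octave + scale[degree]

raw = random.randint(0, 8)  # a node's random note, 0..8
print(raw, "->", index_to_note(raw))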

so what this means is you could use a two-finger pinch zoom to scale a node up, which in turn can be mapped to an instrument's amplitude or whatever parameter you want scale to control. although this video only shows positional usage, both position and rotation vectors are sent to live as midi cc. doing a pinch-zoom pivot rotation gives the xyz rotational messages to map.

the fx units are chained to the synth unit nodes via a spring joint, so the position of the fx unit can modify the location and rotation of the synth node via rigid body physics. this means the control messages of the synth unit can also be affected by the fx unit, and vice versa. this is all experimental and there is no right configuration at the moment.

Humanoid Jam max-live-qc (Composition by dust)

Author: dust
License: MIT
Date: 2010.07.03
Compatibility: 10.6
Categories:
Required plugins:
(none)

so i decided to resurrect some old max files and put together a max - live - qc instrument. this started when i had nobody to collaborate with and decided i wanted my computer to play along with me. well, now with the max - live integration i'm making this possible for everyone. i started with that in mind, but now i'm more interested not in jamming but in listening to the computer improvise.

live instrument:

http://research-ants.com/follow/followMe.amxd.zip

single midi track - minimal tech tutorial

http://research-ants.com/follow/noid.mov

example progressive house

http://research-ants.com/follow/noidHouse.mp3

how to load the patch http://research-ants.com/follow/load.mov

how to live jam http://research-ants.com/follow/follow.mov

how does the computer predict the notes i was going to play?

first order hidden markov chains

no, really, that doesn't explain anything: you input some notes via mouse or keyboard, and the computer calculates the probability of what you will play next based on a 3-note observation.
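to make that concrete, here is a toy python sketch of the prediction step: count which note tends to follow each 3-note context you play, then sample from those counts. it is a plain markov table rather than a full hidden markov model, but it shows the shape of the idea:

import random
from collections import defaultdict, Counter

ORDER = 3  # length of the note observation window

class NotePredictor:
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, notes):
        # learn transitions from a played sequence of MIDI note numbers
        for i in range(len(notes) - ORDER):
            context = tuple(notes[i:i + ORDER])
            self.transitions[context][notes[i + ORDER]] += 1

    def predict(self, last_notes):
        # sample the next note given the last ORDER notes played
        counts = self.transitions.get(tuple(last_notes[-ORDER:]))
        if not counts:
            return random.choice(range(60, 72))  # unseen context: pick any note
        notes, weights = zip(*counts.items())
        return random.choices(notes, weights=weights)[0]

p = NotePredictor()
p.observe([60, 62, 64, 65, 67, 65, 64, 62, 60])
print(p.predict([64, 62, 60]))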

how do i improvise with it? it's best to just go crazy, play all kinds of riffs in, and see what happens.

the patch is quantized so that you can sync it.

inquire for the experimental humanized version.

http://research-ants.com/follow/humanoid.mp3

please note that all this audio is algorithmically generated. the only human interaction is to tell it what key to play.

run the qc patch to see the graphed predictions.