Ali Demirel uses Kineme in a series of Richie Hawtin's M–NUS events

interactive cube with blue lights

This year, Richie Hawtin's label m–nus celebrates its 10th anniversary with a series of interactive performance events called CONTAKT. Ali Demirel, Hawtin's visualist, uses a combination of Quartz Composer and other software to produce the show's video and also to interact with the audience. Here he talks about his use of Kineme to do this.

A little background: the most distinctive feature of the CONTAKT events is a lighted cube (pictured left) containing an RFID scanner. Users who register in advance on the CONTAKT website receive RFID chipcards, which they can hold up to the cube's reader to interact with it in various ways.

Interview with Ali Demirel

Ali: This year we are doing a series of events called CONTAKT, built around the concept of communicating with the audience on an advanced level. We have an interactive cube at the event; members with RFID cards who come to the event and identify themselves will be displayed in my visuals.

String With URL and ParticleTools in action together

A member comes to the event, approaches the cube, waves his RFID card, the cube identifies him, and the name is written to a file on the Mac Mini connected to the cube. I am connected to that Mac Mini via network cable, and I use the Kineme FileTools "String With URL" patch: I insert the network IP address of the Mac Mini, read the file with the member's name, and integrate it with a Quartz Composition I have. I trigger and mix Quartz Compositions via VDMX in my performance. I don't trigger the identified name immediately, because sometimes it doesn't fit the image I'm performing; when I see a new identification, I wait for the right moment and then trigger it.

Before our first performance, I had some cache issues with that Kineme patch: sometimes it wasn't displaying the latest name. I posted the issue on the Kineme forum, Christopher added an "Update" input to the patch, and it worked perfectly at the show!
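The polling logic behind this setup can be sketched outside Quartz Composer. The snippet below is a rough Python analogy, not the actual patch: it assumes the Mac Mini serves the name file over HTTP at a hypothetical URL, fetches it with a no-cache header (mirroring the effect of the "Update" input on the real patch), and only reports a name when it has changed since the last poll.

```python
import urllib.request


def fetch_name(url: str) -> str:
    """Fetch the latest member name from the cube's Mac Mini.

    The URL is hypothetical; the real setup reads the file with
    Kineme's String With URL patch rather than Python.
    """
    # Ask for an uncached response so we always see the freshest file,
    # avoiding the stale-name problem described above.
    req = urllib.request.Request(url, headers={"Cache-Control": "no-cache"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8").strip()


def poll_for_new_name(fetch, url, last_name):
    """Return the new name if it changed since the last poll, else None.

    `fetch` is injected (e.g. fetch_name) so the change-detection
    logic can be exercised without a network connection.
    """
    name = fetch(url)
    if name and name != last_name:
        return name
    return None
```

The performer-facing behavior, deciding *when* to show the name, stays manual, as Ali describes: the poll only surfaces a new name, and the trigger happens at the right musical moment.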

Here are two videos from an audience member which show what I'm talking about:

In the first video, you can also see another composition based on a Kineme plugin used in my live visuals: I took a ParticleTools sample composition and adapted it to my visual set.

Beth: Can you explain how this cube interacts and how it was set up?

Ali: The visual patterns on the cube are not directly controlled by me; we are considering that feature for the next release of the cube. At the moment, those patterns are controlled by the Mac Mini connected to the cube, which simply changes patterns when there is an interaction such as an identification, upload, or download.

Beth: Which of the visuals are pre-rendered and which are generated live/interactively?

Ali: I use two laptops and a video mixer to produce the final output. One of them runs only Processing compositions, and the other runs VDMX triggering Quartz Compositions. Each composition has some variables which I can control manually or via MIDI. I also receive MIDI information from Richie Hawtin's mixer and assign it to some of the variables. Mostly, though, I follow the music and match it manually. A good example is the Kineme ParticleTools Quartz Composition titled "Fire", which I use because it gives me very rich graphics and dynamics.
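The MIDI assignment Ali describes boils down to scaling 7-bit controller values into each composition's parameter range. Here is a minimal sketch of that mapping; the function name and ranges are illustrative, not taken from his actual setup:

```python
def cc_to_param(cc_value: int, lo: float = 0.0, hi: float = 1.0) -> float:
    """Scale a 7-bit MIDI CC value (0-127) into a parameter range.

    Illustrative only: mimics how an incoming mixer CC might be
    normalized onto a composition variable such as particle count.
    """
    cc_value = max(0, min(127, cc_value))  # clamp to the valid 7-bit range
    return lo + (hi - lo) * (cc_value / 127.0)
```

VDMX and similar hosts do this scaling internally when you MIDI-learn a control; the point is simply that each incoming CC value is normalized into whatever range the visual parameter expects.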