Custom Multi-Touch Gestures (Composition by dust)

Author: dust
License: (unknown)
Date: 2009.09.12
Compatibility: 10.6
Required plugins:

So I have made a new source file for custom multi-touch gesturing using the TUIO protocol. This uses the same method I have been using before, but I have added the multi-touch environment as well as a time element for better positive gesture results. I was going to use the Value Historian to parse out my arrays for a copy-and-paste into JavaScript so I could have some pre-recorded gestures, but the VH isn't working for me in Snow Leopard. I have recorded a two-minute video to verify and explain the gesture recording process as it relates to recording and the run-time evaluation mode used to match a gesture. My tolerance settings are pretty tight: 0.01 on x and y, and 0.05 on time.
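The matching rule described above could be sketched roughly like this (a hypothetical JavaScript snippet, not the composition's actual code; the point structure and the sum-total tolerance value are illustrative assumptions):

```javascript
// Hypothetical sketch: every recorded point must match the live point
// within 0.01 on x and y and 0.05 on time, and the summed x/y deviation
// must also stay under an overall tolerance.
var TOL_XY  = 0.01;  // per-point tolerance on x and y (from the post)
var TOL_T   = 0.05;  // per-point tolerance on time (from the post)
var TOL_SUM = 0.1;   // illustrative sum-total tolerance (assumption)

function pointMatches(rec, live) {
  return Math.abs(rec.x - live.x) <= TOL_XY &&
         Math.abs(rec.y - live.y) <= TOL_XY &&
         Math.abs(rec.t - live.t) <= TOL_T;
}

function gestureMatches(recorded, live) {
  if (recorded.length !== live.length) return false;
  var sum = 0;
  for (var i = 0; i < recorded.length; i++) {
    if (!pointMatches(recorded[i], live[i])) return false;
    sum += Math.abs(recorded[i].x - live[i].x) +
           Math.abs(recorded[i].y - live[i].y);
  }
  return sum <= TOL_SUM;  // sum-total check on top of the per-point one
}
```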

With my last attempt I was only matching the sum total of my gesture's x positions, which was a little flaky. With this iteration I am doing a true point-to-point match on three elements, with a sum-total tolerance as well. To start recording a gesture, put one finger down and double-tap with a second finger, then draw your gesture. The global time of a gesture is 2 seconds, so keep that in mind when recording. To enter run-time evaluation, touch down with all 3 fingers. Once evaluation is enabled you can do your gesture at any time. Something I will add in the future is an offset on x and y so the evaluation is the same no matter where you are on the screen; I am also going to add some square roots to evaluate gestures at multiple sizes. Right now this is only doing a 1-to-1 map at the original scale.
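The offset and scale normalization planned above could work roughly like this (a hedged sketch of one possible approach, not part of the current composition): subtract the gesture's first point to cancel the screen position, then divide by the gesture's diagonal extent (the square root) so different-sized drawings compare equally.

```javascript
// Hypothetical sketch of the planned offset + scale normalization
// (the current composition does a 1-to-1 map at the original scale only).
function normalizeGesture(points) {
  var x0 = points[0].x, y0 = points[0].y;
  // Find the gesture's extent after removing the starting offset.
  var maxX = 0, maxY = 0;
  for (var i = 0; i < points.length; i++) {
    maxX = Math.max(maxX, Math.abs(points[i].x - x0));
    maxY = Math.max(maxY, Math.abs(points[i].y - y0));
  }
  // Diagonal size via a square root; guard against a zero-length gesture.
  var scale = Math.sqrt(maxX * maxX + maxY * maxY) || 1;
  return points.map(function (p) {
    return { x: (p.x - x0) / scale, y: (p.y - y0) / scale, t: p.t };
  });
}
```

After normalizing both the recorded and the live gesture this way, the same point-to-point tolerances apply regardless of where or how large the gesture was drawn.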

This is a 10.6 comp, but it should be safe down to 10.4. The requisite plug-in is GL Tools, but that is only used for some cursor triangulation and is not required for the gesture map to work. There may also be a Circle patch from 10.6, but again that is only for drawing a cursor and not needed.

g.qtz (112.97 KB)

Re: Custom Multi-Touch Gestures (Composition by dust)

Feel free to replace the multi-touch logic with a mouse if you do not have an MT surface/table or iPod touch to work on. I will record some example movies for use with Community Core Vision so you can test without being in multi-touch mode; just leave a comment if you have Community Core Vision or another blob tracker. I use my iPod for testing because it gives perfect blobs. You can use the MSA Remote or OSCemote for multi-touch datagram sending via the TUIO protocol. I'm attaching a multi-touch program for you to build onto your iPhone (or use with the simulator) if you don't want to buy one. It uses OSC and sends the requisite data to get these gestures working.

I am using the MSA Remote and the TUIO protocol because TUIO showcases my work on their site, plus it sends some other relevant info that I do not have to calculate myself, like xspeed, yspeed and maccel. Since my program uses the CGPoint context, you will have to take the xpos, subtract 240, then divide by 240 to get into GL context, i.e. coordinates you can use in QC (-1 to 1, etc.). For the y you will have to subtract 170 from the ypos, then divide by 170 and multiply by -1 (invert) to get into the correct context for QC's y coords.
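A minimal sketch of that conversion, assuming the touch area implied by the numbers above (480 points wide, 340 points tall, centered at 240, 170; the function name is illustrative):

```javascript
// Convert CGPoint-style touch coordinates to QC's GL-style coordinates,
// following the steps described above.
function toQC(xpos, ypos) {
  return {
    x: (xpos - 240) / 240,        // maps 0..480 onto -1..1
    y: ((ypos - 170) / 170) * -1  // maps 0..340 onto 1..-1 (inverted)
  };
}
```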

dTouch.zip (321.98 KB)