file tools in an iterator

colinyell's picture

Hi,

I'm trying to use File Tools to save multiple structures to plists from inside an Iterator, but File Tools seems to stop working as soon as it's placed inside one.

Does anyone know of a way around this problem?

jrs's picture
Re: file tools in an iterator

yes

jrs's picture
Re: file tools in an iterator

Oh - you wanted to know how, not just if anyone knew - silly me.

Check out the very quick and dirty example I've hacked together - the main trick is the JavaScript at the very bottom level, which I think was created by Steve (Mokris), although the version I've included is for 3D points and was taken from an example by George (Toledo).
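In case it helps to see the pattern outside the attached .qtz: here's a minimal sketch in plain JavaScript (made-up names, not the actual patch code) of the accumulation trick - a persistent array collects one 3D point per iteration, and only the last iteration emits the full structure for saving:

```javascript
// Hypothetical sketch of the accumulation trick a QC JavaScript patch can use:
// keep a persistent array across iterations and only emit the full structure
// on the last iteration. Names here (makeAccumulator etc.) are illustrative.

function makeAccumulator(total) {
  const points = [];   // persists across calls, like a variable outside main()
  return function addPoint(index, x, y, z) {
    points[index] = { X: x, Y: y, Z: z };   // store one 3D point per iteration
    // only return the complete structure once every iteration has written its point
    return (index === total - 1) ? points.slice() : null;
  };
}

// usage: an iterator with 3 iterations feeding in one point each
const add = makeAccumulator(3);
add(0, 0, 0, 0);
add(1, 1, 0, 0);
const all = add(2, 0, 1, 0);   // → array of 3 points, ready to hand to File Tools
```

The key point is that the array lives outside the per-iteration function, so each iteration adds to it rather than starting fresh.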

If you're looking to use the points/data outside of QC, I suggest using Dust's amorim_xml plugin, which can be found here - http://kineme.net/forum/Discussion/DevelopingCompositions/ExportingMeshl...

Attachment: IteratorSaving.qtz (56.17 KB)

dust's picture
Re: file tools in an iterator

This is the most recent: http://kineme.net/forum/Discussion/DevelopingCompositions/ExportingMeshl...

The "no loop" refers to me taking out the enumeration loop, hopefully improving the performance. I can't say I've tested it inside an iterator though. Feel free to, and let me know.

jrs's picture
Re: file tools in an iterator

I haven't tried it inside the iterator either, as I need all the points in one file and I'm also putting them into a queue to save the data over time. I've been using QC for stimulus generation in a physiological experiment for my PhD. I have an environment/interface that reacts to your attention (through gaze and brainwave/EEG tracking) and saves all of the object and gaze positions out so I can load them into MATLAB for some offline analysis. Hopefully when I'm finished running the experiment I'll have some more time to post the compositions, as I'm sure they will be interesting to many.

photonal's picture
Re: file tools in an iterator

Your PhD sounds interesting. I'd like to map some of the brainwave/EEG tracking data to audio - one might be able to use this to build a positive-feedback mood machine. I.e., if you're in a bad mood (or depressed, whatever!) you could try to re-harmonise some tones by concentrating on the process - might be a useful therapy? [The tones produced would equally help the process along in a positive feedback mechanism whilst reducing the patient's stress levels at the same time.]

I'll coin the term now - before anyone else does : Cognitive Audio Feedback Therapy.

jrs's picture
Re: file tools in an iterator

aka neurofeedback - which is the area my PhD started in (using neurofeedback to treat ADD), but it's since moved towards something a bit more functional. Currently I'm looking at using a behavioral feedback loop to improve object detection accuracy.

Whilst I'm only using the EEG offline at the moment, I have some code that takes data from an OpenEEG device - http://openeeg.sourceforge.net/ - and sends the spectral power (alpha/beta/theta etc.) to QC as an OSC message. It's pretty buggy though, so I never released it - the plan is to create a QC plugin that takes the EEG data directly and allows you to control the frequency band, electrode etc. - kind of like the Kineme Audio Tools but for brainwave data.
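For anyone curious what that looks like on the wire, here's a rough sketch (plain JavaScript/Node, hand-rolled rather than using a real OSC library, with an illustrative address) of packing band powers into a raw OSC message:

```javascript
// Minimal sketch of encoding band powers as a raw OSC message (big-endian
// float32 args). The address "/eeg/Fz/power" is made up for illustration.

function oscPad(str) {
  // OSC strings are null-terminated and padded to a 4-byte boundary
  const len = Math.ceil((str.length + 1) / 4) * 4;
  const buf = Buffer.alloc(len);          // zero-filled, so padding is already \0
  buf.write(str, 0, 'ascii');
  return buf;
}

function oscMessage(address, floats) {
  const typeTags = ',' + 'f'.repeat(floats.length);   // e.g. ",fff"
  const args = Buffer.alloc(4 * floats.length);
  floats.forEach((v, i) => args.writeFloatBE(v, 4 * i));
  return Buffer.concat([oscPad(address), oscPad(typeTags), args]);
}

// e.g. alpha/beta/theta spectral power for one electrode
const msg = oscMessage('/eeg/Fz/power', [9.2, 14.7, 5.1]);
```

A real setup would just send that Buffer over UDP to whatever port QC's OSC receiver is listening on.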

Whilst QC is great for visualizing the data, it's no good for sonification - I was routing the data through QC to SuperCollider (a sweet audio synthesis tool), but unless you have a pretty simple composition you can hear any changes in the framerate.

Why didn't I just send the data to QC and SuperCollider separately? It's because I was using QC to combine the EEG and gaze tracking, which come from separate computers.

gtoledo3's picture
Re: file tools in an iterator

There used to be something available called the DICOM Medical Receiver plugin for QC (or something sort of like that), but I don't think it's available anymore. Maybe someone has it (or maybe I have it on an old HD?). I'm not sure if there's overlap... never really read up on it. It seemed like it was for brain wave visualization though.

A more flexible setup would be one where something can push from the front, for the times it's driven by something external like audio. For example, if a patch never draws anything except when the mouse moves, evaluation doesn't really need to start from the consumer and go "up the chain" to the front, and then back (which is basically what happens now), x amount of times. The mouse event should just push the rest of the graph... similarly for something like audio, where you really need that to take precedence. Also, it would be really useful to be able to lock in the frame rate, or not... I remember that Pixelshox had this, but I don't really remember how well it actually worked.
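To illustrate the idea (a toy sketch, nothing to do with QC's actual internals): a push-style graph where an event source propagates through its downstream nodes directly, so nothing re-evaluates until an input actually fires:

```javascript
// Toy push-evaluation sketch: instead of a consumer pulling "up the chain"
// every frame, an event source pushes new values through its downstream
// nodes, so the graph only evaluates when an input changes.

function makeNode(fn) {
  return { fn, downstream: [], value: undefined };
}
function connect(from, to) { from.downstream.push(to); }

// when a source fires, propagate through the chain immediately
function push(node, input) {
  node.value = node.fn(input);
  node.downstream.forEach(d => push(d, node.value));
}

// e.g. mouse event → scale → renderer, evaluated only when the mouse moves
const scale = makeNode(xy => ({ x: xy.x * 2, y: xy.y * 2 }));
const render = makeNode(p => `draw at ${p.x},${p.y}`);
connect(scale, render);
push(scale, { x: 3, y: 4 });   // render.value → "draw at 6,8"
```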

QC is very extensible: a great interface for editing graphs that doesn't make you want to stab yourself in the eyes, a bunch of different ways to add new stuff, and lots of ability to leverage different techs... it just needs to be even more so. IMO, other node-based editors don't come close, and I would totally say so if I thought there was something better at the moment. There is a need for something better though.

I see a bunch of promising programs and new frameworks, but they all get extremely platform-oriented, don't have great ways to use or integrate pre-existing code, don't have well-thought-out graph systems (nesting insanities in QC for example, ultra-lazy evaluation at times, the editor not being on a separate thread from the viewer), or in some cases are kinda just big libraries of code with some templates... not bad at all, but not a full-fledged development environment... just a bunch of cool code. (end jag)

jrs's picture
Re: file tools in an iterator

Cheers for the insights

I agree with the pushing of data, but this only works when you don't have a rendering backlog (i.e., if you're halfway through rendering a frame when a new audio event comes in, you can't just start rendering again).

For a temporary solution I've been running two separate compositions with VBL syncing turned off: one lightweight composition that receives the gaze and EEG data, combines them and re-sends the data to another, much slower composition for rendering. This is still not ideal though, as all of the state information (object positions etc.) is the main bottleneck and it doesn't sit well in either composition. Really I need to bite the bullet, use something like a model-view-controller approach and move the state information into Cocoa, where it would run much better. Currently though this is a heap of work, as much of my state information and object control is done by interpolation patches, LFOs, stopwatches etc., and I don't fancy re-coding all of the different interpolation types.
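For a sense of what re-coding those patches by hand looks like, here's a small sketch (illustrative function names, not QC's) of a linear interpolation, a smoothstep easing and a sine LFO:

```javascript
// Hand-rolled equivalents of a few QC-style patches; names are illustrative.

// linear interpolation between a and b, t in [0, 1]
function lerp(a, b, t) { return a + (b - a) * t; }

// smoothstep easing: eases in and out, like QC's smooth interpolation modes
function smoothstep(a, b, t) {
  const s = t * t * (3 - 2 * t);
  return lerp(a, b, s);
}

// sine LFO: oscillates around `offset` with the given frequency and amplitude
function lfo(time, freq, amplitude, offset) {
  return offset + amplitude * Math.sin(2 * Math.PI * freq * time);
}
```

Each of these is trivial on its own; the pain is the sheer number of patch behaviours (easing modes, stopwatch logic etc.) that would all need the same treatment.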

I'm holding out hope that I'll have time to learn OpenCL and use it to speed up all of my object control and state - i.e., things like finding out which object is closest to your gaze location, having objects' movements interact with each other, etc.
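For what it's worth, the serial version of that closest-object computation is tiny - a brute-force sketch (illustrative names) of the thing OpenCL would parallelise by computing all the squared distances at once:

```javascript
// Brute-force nearest-object search: find the object closest to the current
// gaze point. This is the serial version; an OpenCL kernel would compute the
// per-object squared distances in parallel and reduce to the minimum.

function closestToGaze(objects, gaze) {
  let best = -1, bestDist = Infinity;
  objects.forEach((o, i) => {
    const dx = o.x - gaze.x, dy = o.y - gaze.y;
    const d = dx * dx + dy * dy;   // squared distance; no sqrt needed to compare
    if (d < bestDist) { bestDist = d; best = i; }
  });
  return best;                     // index of the nearest object
}

const idx = closestToGaze(
  [{ x: 0, y: 0 }, { x: 1, y: 1 }, { x: 0.4, y: 0.6 }],
  { x: 0.5, y: 0.5 }
);  // → 2
```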

Re the DICOM stuff - http://sourceforge.net/projects/quickdicom/ - you can think of DICOM as a stacked/sliced image format (think of the Visible Human Project). It's mainly for static data but can still be used for brain visualisation (i.e., fMRI or a CAT scan).

usefuldesign.au's picture
Re: file tools in an iterator

jrs wrote:
Why didn't I just send the data to QC and SuperCollider separately? It's because I was using QC to combine the EEG and gaze tracking, which come from separate computers.

You could try dividing the composition in two: the first performing the combination of the EEG and gaze tracking and exporting the data, the second importing the data and creating the visualisation. To run them on separate threads you'd have to build one into an app, perhaps using Kineme's app builder program.

Passing data between comps is achievable in a number of ways, like Franz's structure plugin, Kineme's Spooky patch (leaky and probably not suited to moving large amounts of realtime data) and the network broadcaster patch.

EDIT: Just read your most recent post - sorry, I see you've already pushed this as far as you can!