Different type of thought about QC and iOS...

gtoledo3

There's been a great deal of talk about QC on iOS, exporting compositions/baked apps to various other platforms, etc., but seeing something that Lee posted about TeamViewer, along with reading some technical notes recently about the Opera browser on iOS (a proxy-based browser), got me thinking about something...

Could a qtz app be hosted and rendered on the server side of things, like a web app, so that a system viewing the graphic didn't have to have the Quartz Composer framework? And also, could touch data (in the case of iOS) be forwarded via the web app and parsed so that it could do stuff in the composition?

cybero
Re: Different type of thought about QC and iOS...

Probably. I do remember a Java-based approach to wrapping a .qtz in a Java applet. Is Java supported on iOS?

Just googled it, and it looks like it's unsupported but feasible, so back to looking for a purely server-side renderer for .qtz.

I think it might need some kind of restructuring of the composition. Render on the server, present in a format to be decided, presumably within a browser [proxy servers could prove useful]. We've also got TeamViewer mentioned, which reminds me of what Skype and iChat support (screen sharing) and CamTwist (dropping in effects). It's the touch data side where this breaks down for me, even with TeamViewer, although I'm probably underestimating that one. Baking as an application seems the most fully realisable approach for data and control, and even that falls short of the full graphics facilities. A baked application that relies upon the server to render the graphics and receives the client-side data / control responses looks much more likely. If TeamViewer really does that, it's a winner :-)

gtoledo3
Re: Different type of thought about QC and iOS...

Oh no, TeamViewer is just a remote-desktop-type function, no big deal.

I'm basically just saying: when a qtz is hosted on a website, does the thing really have to load via something like Safari, which then uses Mac OS X resources, or could a server running OS X Server host a qtz?

cybero
Re: Different type of thought about QC and iOS...

Currently, yes, it does have to render in such a fashion, in a browser, with a Quartz Extreme card running behind the scenes.

If one ran Quartz Composer on OS X Server as an application service [which would mean some re-assignment of how QC works to begin with], and we reconstituted the composition to be a server-side one so it could render and be received within the receiving application (which could even be a browser sans QC plugin, like on an iPad or iPhone), then I don't see why not; that strikes me as being similar to how aka & mrmr work, albeit possibly done within a browser.

What media type would it render to, I wonder? Possibly H.264, as a kind of dynamic wired movie.

Basically, it's the rendering engine side that's the big problem. Rendering and compressing graphics server side is not new; it's just that such setups tend to be heavily scripted server / client side arrangements tied to enabled environments. Effectively, rendering to anything other than H.264 is probably going to downgrade the original graphical quality of the QC comp.
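To illustrate just the compression half (nothing QC-specific): a minimal sketch using AVFoundation's AVAssetWriter to encode a sequence of rendered frames to H.264. The frame source, dimensions and frame rate here are placeholders, not a working server.

```swift
import AVFoundation
import Foundation

// Sketch: encode pre-rendered frames to an H.264 movie file.
// `nextFrame` is a placeholder; a real setup would pull pixel buffers
// from whatever renders the composition.
func encodeH264(to url: URL, size: CGSize, frameCount: Int,
                nextFrame: (Int) -> CVPixelBuffer) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input, sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for i in 0..<frameCount {
        while !input.isReadyForMoreMediaData { usleep(1000) }   // crude backpressure
        let time = CMTime(value: CMTimeValue(i), timescale: 30) // assumes 30 fps
        _ = adaptor.append(nextFrame(i), withPresentationTime: time)
    }
    input.markAsFinished()
    writer.finishWriting { }  // asynchronous; a real caller would wait on this
}
```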

Interesting idea, raises a lot more questions though.

dust
Re: Different type of thought about QC and iOS...

let's say someone at apple says, yeah, i think quartz composer should work with iOS. knowing that time will solve most of the problems, someone should probably be working on making this possible now, in a simulator or something, seeing that the devices of 2012 are not made yet (or are they?). ironically the ipad was made before the iphone ever came out, so given the nature of planned obsolescence, i'm sure there is a way of testing on a computer of today what a possible iOS device will be like in 2 years.

obviously iOS can't handle what i do with qc today; my computer can barely handle it, especially things like video and CL etc. although sprites, text, gradients, lines, models, even shaders can be done now on an iOS device, or at least the newer ones. so would QC really need to be completely re-written to conform to the specifications of the GL ES context in order to make a safe iOS patch? it may just not be possible.

i know if someone at apple makes the decision now to say yes, integrate qc into iOS, then in the future the device will be able to do what we do today. i mean, an iphone has a faster clock speed than some g4's i have owned. i just think it would be a good decision to make now, knowing the future will change, before QC becomes just a tool for the graphic elite and media content creationists.

iOS is only going to become more popular and more powerful. the devices hopefully will become cheaper. apple isn't going to abandon iOS. so it will be a sad day if apple doesn't decide to integrate QC, because everybody will be using iOSXI and only the small percentage of computer users that do use pro systems will be able to view a qc document.

don't get me wrong, qc still needs to push the boundaries of high-end realtime motion graphics, but if it's not integrated into iOSXI (OSX 11, in case you were wondering what iOSXI means) it will pretty much make the language pointless, because the majority of computer users will eventually switch to mobile computing devices.

there will always be a disparity of computing power between portable and pro devices, and i would hate to see qc alienate itself any more than it already does. i mean, half the computer users in the world today cannot view a qc document, so why alienate the language even further.

so if you ask me, i think qc would be a much better tool if it would at least work with all apple products, high- and low-end devices. i personally can't wait till the iMac has a full-screen digitizer, so it definitely needs to keep pushing the boundaries, but it should also let the little machines have some qc fun too.

usefuldesign.au
Re: Different type of thought about QC and iOS...

I really like your idea, Dust, of an iOS-safe patch mode for QC and a concurrent porting of QClite to iOS. (In your list of feature requests on the other thread.)

In terms of growing the base of QC users, what could be more of a shot in the arm than turning QC into an iOS app development environment that could produce an iOS app without hitting Xcode (or not too much, at least)? It would need lots of development of templates etc. to do all the app-necessary things, but QCAppBuilder by Kineme did just that for OS X apps.

Some Adobe staff did their nut when the Flash cross-compiler was kiboshed, and I think that points to the thirst out there for an app dev tool 'for the rest of us'. The fact that Apple could make something that produces native code and extend the power of iOS (looking forward) at the same time is pretty cool, I think. The limited power of the ARM-based A4 shouldn't be a deciding factor; after all, you guys can run comps on your MacPros and MacBookPros that beachball my Dual G5. Not to mention I can't access OpenCL etc. That's life. Some QC comps would run very favourably on iOS devices and some would not, as with any QC comp on any device.

Not to be rude, but server-side 'live' QC generation exporting H.264 sounds highly 'niche' to me. The delay from touch response to image back alone would be enough to put me off. There's probably some use case George has in mind that I haven't considered; I'm thinking more in terms of general use and appeal. That said, QC's quarantining to OS X desktops is definitely one thing holding it back from being a general content creation tool. Imagine if we could post our comps as web pages viewable by anybody on the web: different league.

psonice
Re: Different type of thought about QC and iOS...

It's certainly possible. You'd write an app that runs the .qtz in an off-screen buffer, then stream the images as a video stream out to the clients. You'd have to run one QC + encoder session per concurrent user, which would be pretty heavy, but I guess you could run at 320x480 or less for iPhone devices.
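Roughly, the rendering half might look like the sketch below, using the Quartz framework's QCRenderer off-screen. The Swift spellings are my guess at how the Objective-C methods (initOffScreenWithSize:colorSpace:composition:, renderAtTime:arguments:, snapshotImage) bridge, so treat them as approximate, and the frame handler is a stub:

```swift
import Quartz
import AppKit

// Sketch: render a composition off-screen and hand each frame to an
// encoder/streamer. Method spellings and optionality are approximate;
// check them against the QCRenderer headers.
func streamComposition(atPath path: String, size: NSSize,
                       fps: Double, seconds: Double,
                       emit: (NSImage) -> Void) {
    guard let composition = QCComposition(file: path),
          let colorSpace = CGColorSpace(name: CGColorSpace.sRGB),
          let renderer = QCRenderer(offScreenWithSize: size,
                                    colorSpace: colorSpace,
                                    composition: composition) else { return }
    for i in 0..<Int(fps * seconds) {
        _ = renderer.render(atTime: Double(i) / fps, arguments: nil) // advance the comp
        if let image = renderer.snapshotImage() {                    // grab the frame
            emit(image)                                              // hand to encoder
        }
    }
}
```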

Getting input would be tricky but not impossible. iOS 4 supports video within the browser or apps (yep, just a couple of weeks back we were on 3.x and this was actually impossible!). If you're doing this in a browser you'd pick up touch information using the touchmove etc. JavaScript events, send it back to the server, and feed it to your .qtz. It's going to be laggy, maybe >0.5 seconds of lag. Maybe that doesn't matter in some cases, but it's something you have to look hard at.

Writing an iOS app to handle the client side would actually be fairly easy too, I think. Adding streaming video is easy, getting touch information is easy; the hardest part is probably handling the network connection to feed back the touch info.
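Sketched out, the touch-forwarding side could be as simple as the view below, which posts normalized coordinates to the server. The endpoint URL and JSON shape are invented for illustration, and a real client would coalesce or throttle the requests rather than fire one per move event:

```swift
import UIKit

// Sketch: forward normalized touch positions to a server feeding a
// composition. Endpoint and payload format are placeholders.
final class TouchForwardingView: UIView {
    let endpoint = URL(string: "http://example.local:8080/touch")!  // placeholder

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let p = touch.location(in: self)
        // Normalize to 0...1 so the server needn't know the screen size.
        let payload = ["x": Double(p.x / bounds.width),
                       "y": Double(p.y / bounds.height)]
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONSerialization.data(withJSONObject: payload)
        URLSession.shared.dataTask(with: request).resume()
    }
}
```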

Could be worthwhile I guess if you have a project where possible lag won't matter too much, and you're not expecting too many clients active at once (or you have a server farm and plenty of bandwidth ;) Otherwise, we have to keep waiting and hoping Apple sees the benefits of iOS + QC.

dust
Re: Different type of thought about QC and iOS...

as for the server-side things: rendering qc so you can view it at a decent frame rate on an iphone is possible now. i mean, i do it all the time on my lan at home, even on the wan on campus. just ask lee; he knows just about every program known to man (and some aliens) that will stream / video-conference a qc file.

i set up qc all the time as a baby cam with effects. so if i'm outside or downstairs or something in my house, i can see my kid on my ipod, or get notified by motion sensing if she wakes up from a nap, by doing what is called an http "server push": http://en.wikipedia.org/wiki/Push_technology

i mean, i make the stream small and usually limit the frame rate, as there is an obvious latency when screen-grabbing to push. unfortunately a java pushlet, as it's called, doesn't work on an iphone; maybe on an android, i have no idea, i don't have one of those things yet.
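for the curious, here's a bare-bones sketch of that push side in swift, motion-jpeg style (multipart/x-mixed-replace) using the Network framework; the frame grabber is stubbed out and the port is made up:

```swift
import Foundation
import Network

// Stub: a real version would screen-grab and JPEG-compress here.
func grabFrameAsJPEG() -> Data { Data() }

// Bare-bones HTTP "server push": send the headers once, then keep
// replacing the JPEG part on the open connection.
let boundary = "frame"
let listener = try! NWListener(using: .tcp, on: 8080)  // placeholder port

listener.newConnectionHandler = { conn in
    conn.start(queue: .main)
    let head = "HTTP/1.1 200 OK\r\n"
        + "Content-Type: multipart/x-mixed-replace; boundary=\(boundary)\r\n\r\n"
    conn.send(content: Data(head.utf8), completion: .idempotent)

    // Push a new frame every 200 ms; small and slow keeps the stream light.
    // (A real server would invalidate the timer when the connection drops.)
    Timer.scheduledTimer(withTimeInterval: 0.2, repeats: true) { _ in
        let jpeg = grabFrameAsJPEG()
        let part = "--\(boundary)\r\nContent-Type: image/jpeg\r\n"
            + "Content-Length: \(jpeg.count)\r\n\r\n"
        conn.send(content: Data(part.utf8) + jpeg + Data("\r\n".utf8),
                  completion: .idempotent)
    }
}
listener.start(queue: .main)
RunLoop.main.run()
```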

i know someone built a tuio server in python or something; i haven't tried it yet, but supposedly it sends multi-touch data over the web.

it's also possible to build a custom iOS app that tracks your touches and sends an http service request with your touch data packed into the header string, to a server that writes the data to an xml file, which a qc file parses while it is being screen-grabbed and pushed out to your app's webkit view. i tried it; like i mentioned above, it's slow.
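the request side of that is tiny; something like this sketch, where the header field name, url and coordinate format are all invented:

```swift
import Foundation
import CoreGraphics

// Sketch: pack normalized touch data into a custom HTTP header field.
// "X-Touch", the URL and the format are hypothetical.
func sendTouch(_ point: CGPoint, in bounds: CGRect) {
    var request = URLRequest(url: URL(string: "http://example.local:8080/push")!)
    request.setValue(String(format: "%.3f,%.3f",
                            point.x / bounds.width, point.y / bounds.height),
                     forHTTPHeaderField: "X-Touch")
    // The server would append each value to the XML file the composition polls.
    URLSession.shared.dataTask(with: request).resume()
}
```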

i'm sure there are some internet gurus out there that would really know how to optimize the process with segmented streaming or something.

this is unrelated, but if you want to browse the internet on a multi-touch table like you do with an ipad, i posted a modified cogee webkit plugin here that you could easily add tuio to. can't say it's stable like iOS safari or anything, but it was a novel idea.

if you go to apple's safari dev page there are some really interesting javascript/css animation things that will let you do qc-style ease-in-and-out type animations with multi-touch in safari, like coverflows and stuff. just search "css animation" and you will find a lot of pretty cool things.

i'll put it to you this way: you can do way more things to video with multi-touch on an iOS device, served through safari, than apple lets you do on the device with its media frameworks. and when i say way more, i mean way more. you can pretty much just play a movie on an iOS device using official apple iphone dev sdk stuff; ok, you can adjust volume and overlay buttons and stuff on a video with the iOS sdk as well. but with the mobile safari webkit stuff, once the movie is buffered into safari, you can do all the regular multi-touch pinch, zoom, rotate, cross-fade etc. on video, or do custom gestures and other css animation stuff.

you can't even auto-rotate a movie based on your iOS device's orientation legally or officially on an iphone. the ipad does this automatically; it's pretty cool, but sometimes it's a pain when you're, like, in between horizontal and vertical, so you have to lock it down. so i have figured out how to do the multi-touch in safari etc. on an iOS device. i'm sure in the future there will be more things ported over to the mobile javascript safari camp, things like the accelerometer for instance.

cybero
Re: Different type of thought about QC and iOS...

So, instead of a server-side renderer / responder, simply enable .qtz's of an OpenGL ES conformant type on iOS devices?

usefuldesign.au
Re: Different type of thought about QC and iOS...

cybero wrote:
So, instead of a server-side renderer / responder, simply enable .qtz's of an OpenGL ES conformant type on iOS devices?

Next step: embed that whole OpenGL ES conforming iQC framework in the Windows version of Safari for the other mob.

cybero
Re: Different type of thought about QC and iOS...

We're back to that old chestnut :-). Well, anything is possible, given sufficient time and effort. Lots of possibilities. It would be nice to have some kind of .qtz streaming server set up. If any ensuing latency issues were addressed, it would help to shoulder such a data / protocol burden.

cwright
Re: Different type of thought about QC and iOS...

cybero wrote:
If any ensuing latency issues were addressed, it would help to shoulder such a data / protocol burden.

I have yet to see a reasonable way to address c (the speed of light) -- thus, there will always be more latency over this kind of mechanism.

(see also OnLive)

cybero
Re: Different type of thought about QC and iOS...

OnLive: great fanfare, slick interface, new kid on the block. Problem Kid :-).

Also look at the re-jigging of those unlimited internet / broadband deals prevalent not so long ago [O2 / AAT].

Bandwidth, bottleneck, kludge & cost.

Moving things client side and supporting them on the recipient platform via a framework set with minimal network involvement, facilitating any local net action & inter-device action, sounds far more feasible.

I can't actually see that as being supportive of all that Quartz Composer currently does.

However, that doesn't mean that some sort of shared, cross-platform functionality cannot be achieved, possibly to a higher level than my pessimistic estimate of how limited such a service might be.

leegrosbauer
Re: Different type of thought about QC and iOS...

Did somebody say streaming Quartz Composer? That's easily do-able. Requires integrating a bunch of apps, however. http://kineme.net/wiki/StreamingQuartzComposer

cwright
Re: Different type of thought about QC and iOS...

I think the feature was "streaming + interactive" -- in that case, the client needs to send events, and the remote end responds.

This is theoretically sound, but in practice there's enough latency to make it undesirable (from my experience thus far).

For non-interactive stuff, it's totally doable (and even mundane things like iChat could be called QC-streamers, in a primitive sense).

leegrosbauer
Re: Different type of thought about QC and iOS...

Yeah. Interactive. Latency. Yeah. I ran some tests last year with a friend in which I screen-grabbed his incoming Skype imagery (hand and head movement) and fed it to teapotti.qtz. I then returned the live teapotti imagery to him via Skype. It worked... but the latency did indeed ruin the desirability factor.

dust
Re: Different type of thought about QC and iOS...

people have actually been studying the psychophysical effects of low-latency interactive audio-visual networking for some time now. here's what psychophysics is, in case you want to follow along.

Quote:
Psychophysics has been described as "the scientific study of the relation between stimulus and sensation" or, more completely, as "the analysis of perceptual processes by studying the effect on a subject's experience or behaviour of systematically varying the properties of a stimulus along one or more physical dimensions".

that's the wikipedia definition; i thought that might be easier to understand than my textbooks on the subject.

what it boils down to, in layman's terms, is the point at which someone goes "f this, i can't play music like this" or "i can't play a video game like this". as an example, take using an iOS device as a youtube remote control: how long does it take to say "f this vnc connection" and get off the couch to select a different video? the person observing this action writes down that the "f this" point is at x milliseconds.

so psychophysics studies things like this, among all kinds of other interesting things; or at least things i found interesting while studying it last semester.

for instance, here is a video made in 2006 of a remote piano duet; at a 50-millisecond latency there is no apparent "f this" point, and you can play piano remotely: http://imsc.usc.edu/dip/CaliforniaStreamIN/movie_20060606_hq.wmv

this is done with a Distributed Immersive Performance server. as the video shows, in 2006 it was already possible to provide a realistic human experience while maintaining synchronous HD video and multitrack audio streams. the single greatest limiting factor for human interaction in an immersive environment is the effective transmission latency.

the maximum allowable latencies range from tens of milliseconds up to one hundred milliseconds at most, depending on the experimental conditions and content. now, given there is latency, the distributed clocks are going to drift slowly as time ticks. so the solution to keeping distributed clock signals synchronized is to use the timing signals of GPS, which is capable of maintaining synchronization among distributed clocks with an accuracy of 10 microseconds or better.

a good source, if you're wanting to try to build a low-latency, real-time audio-video acquisition and rendering system, is http://imsc.usc.edu/dip/system1.html (this info is in reference to DIP 1 and written by RN De Silva). if this interests you and you would like a more current abstract of the psychophysical effects of internet latency, then check the ACM portal library, as there are various more up-to-date case studies.

it's one thing to be able to do this on a college campus; it's entirely another thing to implement a system like OnLive is proposing on our current broadband connections. although it is entirely possible: with OnLive you are only sending hat-switch booleans (up, down, left, right, jump, etc.), whereas the DIP experiments are dealing with distributed and synchronized multi-track audio and video interactions, which are way more expensive as far as packet size is concerned.

gtoledo3
Re: Different type of thought about QC and iOS...

I'd never thought it would be remotely possible to do this, and assumed the latency would be horrendous, but reading about the Opera iOS browser not using WebKit, and supposedly compressing up to 90% of the graphic content on their own servers to accelerate rendering in their browser, made me wonder otherwise. However, after reading about it more, it seems like what they are doing would only be efficient with non-moving images.

cybero
Re: Different type of thought about QC and iOS...

Static images definitely suit server-side rendering.

An active, client-side type of cross-platform framework, similar to and capable of wrapping .qtz [or another plist/xml type system], would be nice [perhaps it could use HAML templating too, just for neatness and sustainability's sake].

I think it'll happen, some time and some way or another, that we'll see a truly compelling rich-media cross-platform framework supportable by means of local distro / app individuation.

The notion of providing from a server an application facility suited to given client hardware is not exactly new; virtualization springs to mind. The key requirement would be a level of co-operation that might not be achievable.

As ever, where there's a will, there's a way [& conversely true also].

dust
Re: Different type of thought about QC and iOS...

there are also companies out there like akamai that are transparently mirroring (copying) the internet and caching it geographically to speed up the net, making things like segmented HD streaming and apple system updates faster, etc.

i'm all for speeding up the internet, but i'm not entirely sure i like my computer contacting their servers and updating them with my computer's information without being asked to do so.