Reality Distortion Field

cwright's picture

My recent fun adventure: http://perpendiculo.us/?p=155

discuss. (perhaps the future of QC is bleak?)


cybero's picture
Re: Reality Distortion Field

Sorry to hear that you were not successful.

Well, if Apple don't want to continue with the QC technology, I can only hazard a guess at what they would replace it with, seeing as they are currently up to their proverbial elbows in that technology across their applications and OS.

Perhaps they'll Open Source the technology.

Of course there is OpenCL & Grand Central Dispatch, and the business decisions regarding Apple's top-line applications.

There is also the possibility that you are simply misjudging some things to do with that unfortunate rejection.

It could easily be the case that almost all of Apple's internal departments have been told to scale back on new hires, even though they no doubt have to focus some resources afresh on bedding Snow Leopard into place and exploiting the 64-bit technologies that run on that OS.

cwright's picture
Re: Reality Distortion Field

yeah, I'm not trying to use it as a definite "qc is dying!" rant or anything of the sort. It is a bit peculiar that the bug reports keep stacking up though (not just in QC, but in SL generally...), but, as you said, there are numerous reasons for decisions. :)

cybero's picture
Re: Reality Distortion Field

cwright wrote:

yeah, I'm not trying to use it as a definite "qc is dying!" rant or anything of the sort. It is a bit peculiar that the bug reports keep stacking up though (not just in QC, but in SL generally...), but, as you said, there are numerous reasons for decisions. :)

It strikes me that part of the logjam might well be that they [Apple] are waiting for ATI to catch up OpenCL-wise, along with a whole slew of 3rd party software and hardware manufacturers catching up with SL's 64-bit promise in general.

The same will be even truer for Microsoft and its 3rd party software and hardware vendors, despite the opportunities Microsoft has had to mature 64-bit operability in Vista, prior to Windows 7 appearing.

dwskau's picture
Re: Reality Distortion Field

Obviously they just couldn't afford to pay you what you are worth. And who could?

gtoledo3's picture
Re: Reality Distortion Field

Ah, cat is out of bag!

cwright's picture
Re: Reality Distortion Field

yeah, win or lose, I wanted it recorded (it's not every day kineme gets to infiltrate the Infinite Loop palisade to chat with the QC devs ;)

toneburst's picture
Re: Reality Distortion Field

Congratulations and commiserations, cwright. From what you said in your blog, though, I wouldn't necessarily say QC was doomed - but then, you have more information than me, and actually talked to the people involved. It would be an odd move to ditch QC at this point, though, given it's so tied in with exactly the kind of technologies Apple are currently so keen to promote. Also, it seems strange to have spent so much time (and therefore money) on a major update to a product, only to then write it off almost immediately.

Pure speculation though, unencumbered by information. ;)

alx

toneburst's picture
Re: Reality Distortion Field

Absolutely.

SteveElbows's picture
Re: Reality Distortion Field

I'm fond of speculating that Apple had QC in mind as an important part of their expansion of multi-touch into larger form factors, but that maybe they've decided not to do a full OS X multi-touch device (e.g. a tablet) in the near future, and to stick to the iPhone-type model, considering the success of the App Store along with other factors such as battery life.

SteveElbows's picture
Re: Reality Distortion Field

The other thing I like to ponder is whether Apple had/have plans to use QC mesh filters in one of their apps, e.g. Final Cut Studio's Motion. This clearly didn't happen with version 4, which came out not so long ago, but perhaps it's still possible if they do a refresh of that product to utilise SL better in future.

cwright's picture
Re: Reality Distortion Field

thanks :)

I don't feel it's doomed, it's just in an interesting position now (given the dynamics of the state of SL, CL, QC, and the QC dev team) that is a bit concerning for the time being. I really don't expect it to disappear (especially in the near or even mid-term future), I just also don't expect massive wild changes either. And it's literally right there on the cusp of allowing really cool new stuff.

[on the plus side, it means we have a bunch of cool stuff we get to work on ;)]

nothing wrong with pure speculation -- I do that far more often than is healthy :)

cwright's picture
Re: Reality Distortion Field

From what I've seen (my time there had exactly zero exposure to anything of this sort, so it's all macrumors-type insights -- even iPhone exposure was limited to like 2-3 questions, the overwhelming majority being GL- and performance-related queries), the tablet is closer to a large iPhone/multitouch device than a multitouch desktop OS X device (yes, the iPhone runs OS X, but you get what I mean, hopefully).

cwright's picture
Re: Reality Distortion Field

ATI supports CL on their newer hardware -- it's just the immediately-pre-SL hardware that isn't supported (and might not ever be, and might not physically be possible due to limitations -- I really don't know, and speculate that it's more software related than hardware related). Either way, they only ship a limited number of ATIs, so that's not a big deal (in a very high level sort of way, mind you).

Waiting for 3rd party support is silly, simply because 64-bit 3rd party apps were totally possible in Leopard from day 1, and were possible in a limited way on Tiger (command-line apps only, more or less, since most of the higher-level frameworks weren't 64-bit at that time). the only twists are QT's general inability to deal with 64-bit (or threads, or colorspaces, or ...), and Carbon's non-promotion to 64-bit (except for CarbonCore, which actually is 64-bit).

SL's kernel boots to 32bit by default on everything but a few select machines, and that's totally fine. it's no different than Leopard, except that 64bit kernel is possible (some of the time).

Windows' struggle with 64-bit is its endless versioning (how many versions of Vista are there? which versions of XP support multiple processors? which ones support 64-bit mode?) -- it's unnecessarily complicated, the complete opposite of OS X's setup (there's Server and non-Server, both come with 64- and 32-bit combined, and both ran on PPC and x86 (until SL), so to the user everything was completely transparent). Much easier to deal with.

toneburst's picture
Re: Reality Distortion Field

cwright wrote:
[on the plus side, it means we have a bunch of cool stuff we get to work on ;)]

Very true. It's a shame it looks like driver-lag is holding us (or me, at least) back from making proper use of the most exciting new features.

a|x

SteveElbows's picture
Re: Reality Distortion Field

Yeah, I get ya. I've probably been reading similar sources to you, so my vague hope for a QC-capable tablet has been much diminished in recent months.

gtoledo3's picture
Re: Reality Distortion Field

Well...FWIW...

Yesterday I had someone ask me to supply code on a bug where I clearly state that all I do is take the default code that is in the OpenCL Kernel patch, highlight it, and hit delete for QC to crash in 32-bit mode - with nothing connected, no rendering, etc.

On another bug, I had someone try to reproduce a 10.5 bug in 10.6, when I clearly stated it didn't happen in 10.6. The reason I'm sure of it is that it happens on 4 different computer/gpu types in Leopard, but on no computers in SL. Soooo, whoever tried to reproduce the bug either didn't read it, or I don't know what... maybe they tested with a 2d image instead of video (another condition of the bug).

Yeah... they asked me to supply their own default OpenCL kernel code, on the OpenCL bug. Either they didn't read what I wrote, they can't understand it, or...? I'm miffed. I don't think they're testing bugs on the hardware/software they get reported on if they're having problems reproducing. I must be taking more time than they are, since I often test bugs on two different machines, sometimes more - if it's really bad, between some friends I have access to a tower, an iMac, and a last-gen Mac mini.

If the machines they test on never have bugs, I want to take my MBPro in and have them hand me over whatever the hell that is as a trade. I'm seriously considering taking it in for return, as I think that NVIDIA 9xxx series may just basically... not work. I've had 68 crashes (actually more, but I lost track of the ones that happened before I did a reformat of my HD), you know what I'm saying? I've had around 40 kernel panics. It also feels like it gets way too hot, imo.

Anyway, tons of bugs, no fixes in sight...doesn't give good vibes :) If I pay for something, I expect it to work. SL is clearly supposed to work on the computer, as it was sold to me.

cwright's picture
Re: Reality Distortion Field

(note that apple's bug reporter reviewer army isn't necessarily technically inclined, and usually isn't software-development inclined either. So, like the iphone app store, it'll be pretty hit-and-miss depending on your luck with the reviewer :/ if you're particularly lucky, Troy will snap up the report before a dweeb who doesn't know what's going on jumps in - I like when that happens :)

gtoledo3's picture
Re: Reality Distortion Field

Yeah, in Apple's defense, I've been contacted before and told that if anything gets too crazy with Bug Reporter to feel free to let them know more directly by just emailing the QC "team".

It goes further than technical inclination; it indicates that the person can't actually read the English language. They must be skimming the Bug Report or something, and not reading it word for word (maybe having been infected by the general A.D.D that goes along with creating bugs in the first place). I clearly stated that the code is in the kernel by default.

If I knew NOTHING about QC and had never ever used it, I still wouldn't write back to ask for the code, because it would be clear to me that it had nothing to do with user supplied code, and that we had everything we needed. (I say this having had to vet software bugs like this on a regular basis at a past job... sometimes I wouldn't be familiar with the particular app, but common sense and actually reading the words in front of your face goes a long way).

SteveElbows's picture
Re: Reality Distortion Field

If I had to guess I'd say that OpenCL is littered with bugs, as opposed to there being something inherently wrong with the nvidia 9XXX series.

toneburst's picture
Re: Reality Distortion Field

I sent an exasperated email to the Apple Quartz Composer Dev list after the nth apparently OpenCL-related crash - you may have seen it. Someone got back to me to suggest that part of the problem may be that there is no memory protection on GPUs, and therefore it's relatively easy to crash them with bad code. On the other hand, the compiler (which is hardware-specific) should be able to spot malformed code entered in an OpenCL Kernel patch, I'd have thought, so again it's back to driver issues as a likely cause...

I don't know if anyone can remember that far back, but I wonder if this kind of issue cropped-up when OpenGL was first adopted. My guess is that it did.

a|x

gtoledo3's picture
Re: Reality Distortion Field

I'd say "both". I think the NVIDIA chips are dogs, they suffer massive design flaws that I'm not sure can be mitigated with crafty software, and will slowly massacre themselves into heat related death. It pains me to say this, and I'm really uncertain if I want to sit here without an MBPro until they figure it out.

Anyway, on Chris's deal... blue Mazda? What, no Prius? Is it just me, or did anyone else think it was funny and ironic for a Prius to be used in the OpenCL folder examples (lots of hype, under-horsepowered).

cwright's picture
Re: Reality Distortion Field

Ha, when OpenGL was first introduced it was a complete crapshoot -- games often had several different pipelines for each GPU so that things would work right. (that's still the case sometimes today, but for performance reasons -- each card's driver has a different "fast path" for optimal framerates).

The memory-protection thing is technically valid (as in, there's no MMU/virtual address space on GPUs), but like you, I also think that's sorta a copout. the driver clearly knows the size of each buffer (due to the CL API -- I know this with certainty because I've actually written a rather abysmal proof-of-concept hybrid CPU/CL pipeline test to justify us not investing any time in CL for the time being, due to severe performance problems when handing data back and forth between the CPU and GPU), so it should be able to do the right thing even in weird edge cases. And if unprotected writes are able to bring down the system (as is apparently the case), that SEVERELY reduces the actual validity of CL -- I mean, if an app crashes, big deal, it sucks. But if every crash takes the system down (or at least poses a legitimate risk of doing so), we've reverted back to OS 9/Win3.1 days. Not very beneficial...
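The buffer-size point above can be sketched in plain C (a hypothetical illustration, not real driver code: in actual OpenCL C this would be a `__kernel` function indexing with `get_global_id()`, and the validation would live in the runtime/driver, which knows each buffer's length from `clCreateBuffer`):

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical sketch of the bounds check a CL runtime could apply:
 * since the runtime knows the length of every buffer it allocated,
 * an out-of-range write can simply be dropped instead of scribbling
 * over unrelated GPU memory (the panic-inducing failure mode above). */
static void guarded_write(float *buf, size_t buf_len, size_t idx, float value)
{
    if (idx >= buf_len)
        return;            /* out of range: discard the write */
    buf[idx] = value;      /* in range: safe to store */
}
```

The check costs one comparison per write; whether that's acceptable on a GPU fast path is exactly the trade-off being argued about here.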

cwright's picture
Re: Reality Distortion Field

(smokris drives a Prius -- expect your account to get banned, and a C&D/DMCA violation notice in the mail shortly ;)

I don't know if the chips themselves are poorly designed, or if they're just poorly enclosed (for the heat-related issues, at least) -- the latter would be Apple's case design engineers' fault, not nvidia's. Driver stuff is pretty clearly nvidia's problem though (ditto for ATI).

at the core, a GPU is simply an array of very limited purpose CPUs -- instead of maybe 8-16 cores, GPUs can have hundreds. They can't deal with the kind of code CPUs deal with (jumps/branches everywhere, kernel/user mode switching on the fly, etc), but for raw compute stuff (like CL, surprise!) they're hard to beat (in theory - my cl test app shows that getting data back off the GPU for CPU processing takes longer than just computing the data on the CPU, but the computation used was very simple, and more complex math would hopefully start to pull ahead after a while). Since they're simple, it's difficult to say if there's a physical design flaw in nvidia's silicon...
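The transfer-vs-compute trade-off described above can be illustrated with a toy cost model (all numbers are invented for illustration; real costs depend on bus bandwidth, kernel complexity, and driver overhead):

```c
#include <assert.h>

/* Toy cost model in arbitrary time units per workload:
 *   CPU: n elements * ops each, run serially.
 *   GPU: a fixed per-element transfer cost to move data on/off the
 *        card, plus the compute amortized across many simple cores.
 * For cheap per-element math the transfer dominates and the CPU wins,
 * matching the observation above; heavier math eventually favors
 * the GPU. */
static double cpu_cost(double n, double ops_per_elem)
{
    return n * ops_per_elem;
}

static double gpu_cost(double n, double ops_per_elem,
                       double cores, double xfer_per_elem)
{
    return n * xfer_per_elem + (n * ops_per_elem) / cores;
}
```

With, say, 100 cores and a transfer cost of 10 units per element, a 1-op/element job costs n on the CPU but about 10n on the GPU (transfer swamps everything), while a 10,000-op/element job costs 10,000n on the CPU versus roughly 110n on the GPU.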

toneburst's picture
Re: Reality Distortion Field

cwright wrote:
...if every crash takes the system down (or at least poses a legitimate risk of doing so), we've reverted back to OS 9/Win3.1 days. Not very beneficial...

Well, it's not every crash that takes down the OS. Often it seems to be a QC hang (that might recover itself potentially, if you leave it long enough), but yeah, there's a chance it might either leave the machine in an unstable state, lock the whole OS completely, or cause a Kernel Panic. Crashes seem to happen so often though, that you're going to experience one that takes down your system sooner or later. I had two or three yesterday, and lost count of the number of times QC locked-up or just silently quit.

I must admit, I've not been massively impressed with the speed, either. I guess some of that might be down to Iterator slowness still, or just the overhead of multiple OpenCL kernels running at the same time. I may have to try the old 'Iterator unrolling' trick again- but that way lies a massive time-sink, and I'm not sure I want to get into that again, after my experience working on the previous maths surfaces QTZ.

Incidentally, wasn't there some talk a while back about automatic Iterator unrolling as a possible future feature of Kineme Core?

a|x

cwright's picture
Re: Reality Distortion Field

toneburst wrote:
I don't know if anyone can remember that far back, but I wonder if this kind of issue cropped-up when OpenGL was first adopted. My guess is that it did.

another cool problem in more modern times is legacy compatibility -- there are a large number of games written that expected certain GL quirks/behaviours that are no longer the case. As such, they now misrender when running on modern GPUs. So even though GL's long established now, its problems are far from over :) I can only hope CL's more canonical definition will help keep this kind of problem at bay (compute leaves a lot less wiggle room than "fog" or "per vertex lighting" etc).

cwright's picture
Re: Reality Distortion Field

Yeah, not every crash is a panic, but every CL crash (on the GPU; not CPU-land bugs and problems) could risk one -- imagine a cl kernel overwriting some GPU control structures (maybe some DMA parameters, if those are GPU-accessible, for example), and then the GPU gladly blasting arbitrary data wherever it wants in system memory (since it's unprotected, unlike an app itself) -- you could cook the kernel (panic), or, if you're particularly crafty and malicious, you could perform various exploits. it'd be gpu-dependent, but I bet some college research team could work out something awesome without taking too long.

That kind of risk is absolutely unacceptable -- then a malicious composition on a website could in fact root your machine, destroy your data, etc. Before people freak out, I don't know if that's even possible, but there may be potential...

P.S. -- how can you not be absolutely blown away-amazed at the new speed? ;) I loved hearing people claiming it was miraculously faster, but finding my own (measurable) tests to show only modest improvements. (Kinda a shame they said no -- I had a small portfolio of few-line tweaks to regain a few ns here and there per patch... looks like we'll have to make a plugin to do it manually ourselves...).

Can you find some time to do an iterator test? I'm actually really interested to see how that goes. Getting performance inspector up and running in SL (with iterators) is a fair amount of work I haven't had time for, so having some numbers to point at would be cool.

gtoledo3's picture
Re: Reality Distortion Field

Well, I actually LIKE the Prius (and want one... at a former job, when they wanted to promote me to a position I wasn't crazy about, I said I would do it if the company gave me a Prius to drive for as long as I held the position), because it makes sense given my criteria for car buying. I wasn't bringing up its modest horsepower as a dig -- though as it applies to Apple's side of things, it is a bit of a dig, I guess.

I think you're right about them being poorly enclosed. I'm really skeptical about the layout of the internal components. Designing analog preamps and compressors is definitely different, but I know that the level of heat, and the parts that are in the vicinity is something that I wouldn't find acceptable.

Granted, there may be some great specs on how this stuff can withstand certain temperatures, and I may be totally wrong... it's just that common sense, and a pretty good "feel" for this across many types of hardware, tends to make me think that a macbook is about a 3-year proposition with a slow spiraling death, and maybe some serious hardware failures and engineering flaws along the way -- which, if you're lucky, you can get Apple Care to fix, and if not, they will come up with some lame b.s. (like not fixing the top crack in my white macbook because of "liquid damage" that doesn't exist). I can predict that I will be completely beside myself if this MBPro fries, and every past experience I've had with hardware is telling me that my better judgement would be to heed my instinct.

I'm absolutely sure that some of it is driver related, but I'm also getting weird stuff like walking back to my computer when my screensaver should be on and seeing a garbled jumble of GPU crap that may or may not have to do with a qtz I worked on earlier, or even days before. I don't even know what to say about that, and I'm at a bit of a loss on filing the bug for it as well. I'm to the point where I'm going "can this even be fixed?" However, piddling around in Vista this morning made me want to fling the computer off the table, so whatever :)

THIS is why I want OS X, and to just make my own computer. I don't care about making it cheap and crappy like this Psystar/Hackintosh garbage, I would make it better than what I can buy from Apple.

SteveElbows's picture
Re: Reality Distortion Field

Well, there were specific issues with some 8600Ms in MacBook Pros if I recall, but I see no evidence that nvidia in general, or Apple's use of the 9XXX in macbooks, is flawed in the way George is suggesting.

I think it's more likely that we are suffering a triple whammy of OpenCL bugs, graphics driver bugs, and QC-specific CL bugs.

I was extremely hopeful for the future of OpenCL, although my performance expectations were based more on using future GPUs down the line than on what is achievable at present. My spirit was dampened only a little by the SL launch CL bugs, but as time goes by without the issues being addressed, I lose hope; I even ranted on the mailing list about it a few weeks back. I still hope it will settle down, but it's certainly shaken my faith somewhat.

SteveElbows's picture
Re: Reality Distortion Field

Well, I would not be shocked if Apple sometimes got the balance between heat and looks/volume of fans wrong. However, that's one of the reasons I got a macbook & then a macbook pro: I actually want to use it on my lap, and many laptops have stupid fans on the bottom which I don't want to block with fabric etc. And the whirring noise from PCs I used to build slowly drove me insane.

I'm generally lucky with hardware, but even so I treat the 'computer dead after 3 years' scenario as distinctly possible no matter who makes it. Things generally aren't built to last these days, and it's going to take some time, and resource crises, to change that.

SteveElbows's picture
Re: Reality Distortion Field

Whilst on the subject of CL woes, does anybody else get a strange mouse phenomenon when doing something in QC that is very OpenCL-intensive? It's like the mouse sampling rate/lag goes all funky and it isn't smooth anymore; it jumps from position to position in a jarring way. I've had this fairly consistently on a macbook pro with the 8600M GT.

Anyways, I'm just having to get used to adjusting my expectations. I now look on OpenCL as something that could ripen into a joy one day, but is not safe territory to be walking on at the moment.

SteveElbows's picture
Re: Reality Distortion Field

I would not be completely shocked if they had a policy of asking for code even when told that the code comes from Apple, just to rule out the possibility that the code has been altered somehow. OK, maybe this makes no sense either, but if I lacked specific knowledge and was reading bug reports for a living, I'd probably end up asking the person who filed the bug some tedious questions that annoyed them too.

cwright's picture
Re: Reality Distortion Field

yes, this drives me insane (as in, I gave up on QC4 after trying N-Body, and losing my mouse).

I think it's because CL kernels operate on the GPU, which can't be interrupted (at least on OS X), so while it's doing its thing, other tasks (like handling mouse interrupts) kinda stack up without getting executed in a reasonable timeframe. On Vista/Win7, the maximum length of time a kernel is allowed to execute is something tiny, like a tenth of a second (with registry hacks to change it) -- I know of no such limitation on OS X, which is a horrible idea if it does that while disabling interrupts. (it'd be interesting to see if other interrupts, like network and harddrive, are also squelched for the interval -- if so, combined with the unprotected memory nonsense, we've got an absolutely terrible design. Clearly someone wasn't thinking about malicious uses...)

SteveElbows's picture
Re: Reality Distortion Field

Oh, and if you have heat fears, something like iStat Menus might help; there are plenty of realtime heat & power usage stats available there, which can sometimes help either reassure you or confirm your worst fears.

dust's picture
Re: Reality Distortion Field

well i must say, to remain positive chris, if this is something you want, more than likely you will get it some day, if you remain positive. you're on apple's radar now, and whoever it was that recommended you will more than likely be in a position to make hiring decisions someday. i just say that because most of the software guys at apple seem to last about 5 years before they venture out and do their own thing, which you are already doing. the products and plug-ins you share with people greatly improve the qc experience, so from a user's perspective we all know this, and so does apple. it seems apple's IMG group is an intrinsic element of apple's team, seeing as so many apple buyers use their machines for media-related tasks. as far as the future of qc being bleak, it most certainly has a bleaker outlook without you being involved with the qc dev team, but as far as it getting shelved in the near or mid future, i don't see that happening at all.

cwright's picture
Re: Reality Distortion Field

seconded! I love iStats (or menumeters, in older contexts). that, and smcFanControl -- both handy tools :) (one friend of mine has to run smcFC all the time with the fan maxed, or his machine hardlocks due to heat after a while. No way to fix that otherwise...)

cwright's picture
Re: Reality Distortion Field

Note that I'm not angry or depressed or looking for sympathy or a pity party :) I'm a bit disappointed (for a few reasons, some listed below), but that's about it. So please don't take any of this as "chris has a bad attitude/is despondent/is negative" etc. -- I'm perfectly fine (and wrapping up QuartzBuilder 1.2!), just wanted to present some information to the QC community with a glimpse from the inside :)

dust wrote:
the products and plug-ins you share with people greatly improve the qc experience, so from a users perspective we all know this, and so does apple.

From the sounds of it, not so much. The majority of my resume was devoted to kineme stuff, and exactly 0 questions came from that part. I was asked if I knew Objective-C multiple times (... dur? have you effing looked at when I post code in bug reports, on mailing lists, or does the fact that we've released several products, and many more freebies not register with "this guy can write software"? Have you noticed how I've hacked the objc runtime to hell to get some of our tricks to work, like KinemeCore and GLTools?), I was asked a bunch of GL questions, and a few comp-sci style questions, but nothing about kineme technology. I was asked how I'd improve QC, how I'd improve user relations, how I'd build interest in it, and what my plans for the future would be, and those were all really long responses/discussions :)

So anyway, my take is that apple's not too serious about qc, and doesn't have much of a plan for it for the short term future. they have no idea what users want/need/are interested in, they don't know which bugs/limitations annoy users, and they don't know what kinds of problems qc can solve. With that many unknowns, it's easy to see why a requisition would get rejected, but it also doesn't build confidence in QC's future (you can't go somewhere without vision or goals).

leegrosbauer's picture
Re: Reality Distortion Field

At another time and place perhaps, I too would really like to hear your thoughts on building interest in Quartz Composer.

SteveElbows's picture
Re: Reality Distortion Field

I suppose the possibility that they just have a counterproductive, rigid corporate interviewing method / standard-questions system in place should not be ruled out?

If you had got the gig I would have been both excited and concerned -- concerned about being blind in future, since you are a billion times better at communicating the realities of QC than anyone at Apple is allowed to be, and I fear that would be lost if you were behind the aluminium curtain.

I don't like investing my time in platforms with unknown futures, where communication and release schedules are murky. But probably not all of these QC problems are Apple- or corporate-specific; I've used various realtime software for about a decade now, node-based programming where possible, and it has usually suffered similar issues: limited resources, a small userbase, the userbase being further segmented by the large number of different things users are trying to achieve, not enough money, etc.

cwright's picture
Re: Reality Distortion Field

that should be a thread for another discussion -- I'm sure a lot of users here would have equally (or even more) interesting thoughts and things to say regarding that :)

gtoledo3's picture
Re: Reality Distortion Field

Well, the key would be for Apple to basically be integrating the solutions that Kineme stuff offers into their own scenario...

  • Not locking in releases to OS updates.
  • Absorbing the function of 3rd party stuff on their own terms.
  • Not "breaking" stuff and reducing function.
  • Having one-on-one contact, and not shrinking from questions or being afraid to say something doesn't work, or to justify decisions. (In audio engineering, I wouldn't be surprised to be contacted by the heads of a company to explain why something works how it does -- though I'm talking about gear more expensive than Macs, so they ought to. I fully expect anyone to disclose if something doesn't work right, or be prepared to justify their thought process... or to simply say "that is private and not for public discussion or debate," instead of a sidestep/non-answer.)
  • Taking time to actually look at what people are doing beyond the most cursory and superficial observation.
  • Making sure that "innovations" haven't been surpassed in more stable ways before implementing them in house.
  • Small contests and public events to hype the tech.
  • High level abstraction for app building... ala QB.
  • Limited implementations on stuff like iPhone, or possibly on Windows through frameworks held in iTunes or Quicktime (I'm positive that no one but me would have the nerve to suggest that to Apple... that would be one of the first things out of my mouth, with which I'm sure I would be ushered out the door!)

  • Then, I'm sure Chris would say a bunch of stuff about performance, multi-threading, disclosing more code, etc :)

At this point, there is basically a big market for a QC editor and viewer that works correctly in SL.

This is all pretty blazingly obvious stuff. Then again, this is better fleshed out somewhere else than this thread.

cwright's picture
Re: Reality Distortion Field

it's likely related to budget and corporate structure, among many other things. they have their reasons, and I'm ok with that.

I completely agree on the bitter/sweet aspects -- I love being able to release "when it's done!", not when the next scheduled update is. (that lets us do awesome stuff mid-cycle, like Kineme3D or ArtNet, or KinemeCore, or anything of ours, really). I sincerely hope they commit to being more open on the list/through documentation, as this appears to be a common issue people bring up. I actually like not keeping people in the dark - even if there's no hope, it at least lets them know they've been heard, and that they aren't talking to a wall. psychologically, that means a lot.

I think you're dead-on with the general node-based thing. That's not just QC though (CL/GPU drivers are apparently suffering from similar issues as we speak). It's definitely all cutting-edge, very cool stuff, but it lacks the financial backing of more boring projects (iWork, office, etc.).

I'm sure they have a plan B or something for QC -- without it, how will they showcase all their cool new stuff? :)

gtoledo3's picture
Re: Reality Distortion Field

The Obj-C thing makes me chuckle when I think about it.

QB 1.2 is going to make people poop themselves. Uhm... in a good way. What a weird metaphor.

gtoledo3's picture
Re: Reality Distortion Field

Well, as far as node-based stuff goes, Cycling '74 (Max/MSP/Jitter) has been beating the drum for quite a while now. I'm not aware of them having suffered problems, but I haven't been a user either. I just remember it coming out way back when in an audio context, trying it out to see it work, and being amazed at all it can do currently. I do agree that I've seen many others cycle in and out, etc.

SteveElbows's picture
Re: Reality Distortion Field

I started another thread about this topic:

http://kineme.net/Discussion/General/BuildinginterestQuartzComposer

SteveElbows's picture
Re: Reality Distortion Field

Yeah, Max/MSP/Jitter has stood the test of time, but I would not say its evolution was without problems. I'm not a giant fan of Cycling '74, but they have clearly been doing something right. I will be very interested to see if Max for Ableton Live takes things to a new level in terms of people dabbling with node-based stuff.

cwright's picture
Re: Reality Distortion Field

From what I recall (mostly as a silent historian, as I've never actually dealt with msp), its history (especially early on) was plagued with stability and performance issues (not unlike what we're seeing now, perhaps?) -- also, 3rd party extensions became basically required, while still suffering stability issues themselves (kinda like some of our plugins ;). I believe most of that's ironed out now, but as I stated earlier, I've honestly not worked with it or its users enough to know details, and I could be completely mistaken.