Installation Feedback

mattgolsen's picture

So I'm building a fairly large QC installation that will initially consist of 18 displays. I've got a few questions for people who have done something of this size, as I've never done anything remotely close to it.

Firstly, some knowledge of the layout will help. I'm dealing with an area of about 10,000 square feet, with network cable already run to each area that will contain the displays. There is a network cabinet at each location which already contains several switches and can get a bit toasty. My biggest concern is whether I should have a node (a MacBook Pro) at each of these locations, running with the lid shut inside each cabinet, or whether I should run video over Cat-5e and centrally control each node from a locked room. Limiting access is a must, as this is a permanent installation and I can't stay with it (the facility is open 16 hours a day). Each node will display custom information (via XML) depending on its location in the building, so each one will initially have to be set up by hand until I can figure out the best way to build a central patch to control all of the nodes (I've seen ygBox; it's not exactly what I need, but it's similar).

Any insight on the best way to build a master network control patch would be phenomenal as well: identifying the separate machines, sending multiple URLs and other variables to one node, etc. I've managed to muddle my way through to an early alpha stage, and I'm not sure how to proceed from here.
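To make the question concrete, here's the sort of addressed-message scheme I've been imagining for the master patch (all names hypothetical; plain JSON in Python just to illustrate the idea, not an actual QC patch):

```python
import json

def make_message(node_id, payload):
    """Encode a control message addressed to one node ('*' = all nodes)."""
    return json.dumps({"to": node_id, "data": payload}).encode("utf-8")

def accept(raw, my_id):
    """Decode a message; return its payload if addressed to this node, else None."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["data"] if msg["to"] in (my_id, "*") else None

# A master would send make_message(...) over the network (e.g. UDP via
# socket.sendto); each node calls accept() on whatever it receives and
# simply ignores messages meant for other nodes.
```

That way one master can push a URL to a single display or a command to all of them with the same mechanism.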

So really what I'm looking for is advice based on actual physical installation experience: whether to dedicate an MBP to each node (either at the installation point or in a physically secure location), and anything else those of you who have done something similar can throw at me.


bernardo's picture

Just a small comment: don't run a MacBook Pro with the lid closed; sooner or later it will overheat. Instead, use a Mac mini or something like that. As for the rest, sorry, but I can't help.

mattgolsen's picture

I was hoping for a little more feedback, but I guess I'm just going to have to wing it. Wish me luck!

franz's picture
iMacs ?

If the content is not projected via a beamer but shown on a computer display, I would use only iMacs, as the screen is included, and you can rent them for around 300€ a week or so. Possibly, I would make two programs (using Cocoa + IB + QCView), making extensive use of the Network Broadcaster patch: one master program that sends control data (if needed; from what you said, I don't think there's a need for a master CPU), and one client program, copied onto every machine, with a minimal setup config: just enter the ID (manually, or by automatically loading a local file that contains the ID info, like a plist).
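The local-ID file could be as simple as this (hypothetical path and key names; Python's plistlib just to sketch it, where the real Cocoa client would read the same file into an NSDictionary):

```python
import plistlib

def save_node_config(path, config):
    """Write the per-machine config once, when the node is first set up."""
    with open(path, "wb") as f:
        plistlib.dump(config, f)

def load_node_config(path):
    """Read this machine's ID (and any other local settings) from a plist."""
    with open(path, "rb") as f:
        return plistlib.load(f)

# Each machine carries its own copy, e.g.:
#   save_node_config("node.plist", {"NodeID": "lobby-east"})
```

Every client runs identical software; only this one little file differs per machine.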

If iMacs don't suit you (too big, no need for the screen), then I would go for Mac minis. But NOT MBPs, especially with the lid closed, as they might overheat. Plus, MBPs get stolen quite easily compared to iMacs.

Hope this helps, feel free to ask more if needed.

One last piece of advice: think minimal.

cwright's picture
logistics

Just as my personal preference (having essentially no experience in what you're doing), running "unusual" stuff over a familiar cable is typically a bad idea (video over Cat-5, SCSI over parallel port, MIDI over XLR, etc.). It's possible, and it usually works (signal attenuation notwithstanding), but it's amazingly confusing to people in the future who expect Cat-5 to mean network/internet (or maybe power-over-ethernet).

I second (third?) the lid-closed problem -- they overheat quite easily, so don't do that.

A machine per screen would be nice if it's in the budget -- that way, a single system outage doesn't take down the entire installation.

franz's picture
right, although...

You're definitely right. However, I found that sending MIDI through XLR (=DMX) cables was very reliable, added to the fact that venues generally come equipped with DMX wiring everywhere. It is just simpler to send MIDI through existing DMX cables (or audio XLR) rather than running a new cable by hand. This comes from my personal experience...

cwright's picture
mixed bag

It works well from our experience too (we did that several years ago for exactly the same reason).

My hesitancy comes from plugging in devices that are expecting one set of voltages/signals and having them receive something else instead, sometimes with disastrous results. Nuked hard drives are the first thing that comes to mind, though perhaps an amp blowout from MIDI -> XLR -> audio system happened once as well?

smokris's picture
Nuking Drives

cwright wrote:
Nuked harddrives being the first thing that comes to mind, though perhaps an amp blow out from midi->xlr->audio system happened once as well?

I think the only real problem we've had was that during one gig back in the late 1990s, we accidentally plugged a SCSI drive into a parallel port (since both used DB25 connectors), and it fried the drive.

MIDI operates at ±5 VDC. Typical XLR audio runs around +4 dBu, which, if you run a sine wave at full blast, is around 2 VAC. So if you run MIDI into XLR audio, you would get some clipping, but probably nothing disastrous.
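For the curious, the dBu-to-volts conversion behind that figure works out like this (0 dBu is defined as 0.775 V RMS):

```python
import math

def dbu_to_vrms(dbu):
    # 0 dBu is defined as 0.775 V RMS
    return 0.775 * 10 ** (dbu / 20)

v_rms = dbu_to_vrms(4)           # about 1.23 V RMS at +4 dBu
v_peak = v_rms * math.sqrt(2)    # about 1.74 V at the peaks of a sine wave
```

So a full-blast +4 dBu sine tops out a bit under 2 V at its peaks, comfortably below MIDI's 5 V swing.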

I've blown out more amps with microphone feedback... :^)

mattgolsen's picture

We're going to be using 37-inch LCDs, and probably the newest unibody MacBooks. We basically have an unlimited supply of both; I'd use Mac minis if we had any.

I thought about stripping the LCDs off the laptops and running them headless (a little ghetto), but I'm not sure how that would work out.

The plist is a great idea, that hadn't occurred to me.

mattgolsen's picture

The great thing is that all the Macs and LCDs are essentially free. The only thing I have to do is rebuild them from an always replenishing pile of broken ones.

The heat part definitely worries me though.

mattgolsen's picture

I totally missed these replies when they happened, I was without power due to bad weather for a few days.

The great thing is, in the environment I'm in I have an unlimited supply of Macs and LCD televisions... at no cost :D The one bad thing is having to deal with a mix of video adapters, because I have to work with whatever comes in and rebuild each machine from other laptops myself.

The basic setup I think I'm going to go with is mounting the laptops on the back of the LCDs in some way, with the lid partially open and the back end pointed up for the greatest amount of heat dissipation. We already have over 100 network drops in each location, which helps immensely, so we won't have to run any cable.

Next up I have to figure out the best way for updating these 18 displays whenever I make a revision to the composition.

cwright's picture
network + thin client

The easiest way to update the compositions would be to load them all from the same location on a network share. Then, have the app periodically (every 5-10 minutes? Maybe once an hour/day? depends on how urgent you want them to go out I guess -- or, have a push notification or something) check for updated versions, shut down the current one, and reload. If done carefully, the graphical output wouldn't go blank, it would just pause (as it's loaded) and then jump (as the first new frame is rendered).