ALL THE VIDEO STUFFS

The following is an email that I sent a couple of years ago, when I was advising on the video design for a production of Laramie Project. This is typically a video-heavy show, and they wanted to incorporate some live cameras in the show, as well as learn Qlab. I still get a lot of the same questions about video production on stage, so when I found this diatribe, I thought it would be worth sharing here on the blog.

***

Ok, so I’m going to drop a bunch of science on you regarding video on stage:

Right off the bat: the cheapest and easiest way to do this is if you already have one of the new Mac Pro towers. They are expensive but powerhouses, so you may or may not have one available, or lying around in a design studio. Qlab only runs on OS X. It will manage video playback and live camera feeds, all in one place, with custom geometry, etc, to be displayed on any number of “surfaces” (LCD screen, projector, etc). All it does is route inputs/sources to destinations, and it has a friendly Go button so anything can happen or change in one fell swoop, chained together however you want it. The Mac Pro comes into play because of the video outputs – HDMI and Thunderbolt, which is basically Mini DisplayPort as far as video is concerned. (The high throughput of that port means you can also put other high-octane devices on it, like video converters, drives, etc. So, to avoid any confusion if you read something else: it’s more than just video.) Out of the box, it will support three 5k displays, or six lower-resolution Thunderbolt displays. (5k is basically the current top spec out there.)

So, if I recall our conversation, you have two video screens/TVs, a projector, and another one or two video “surfaces” for effects. So a full version of Qlab and a Mac Pro would handle this, given other details…

Video is a maze of codecs and types, but here are some facts to know:

  • There are digital and analog feeds.
    • HDMI and DisplayPort are digital. (Unrelated fact: most versions of those two standards can carry audio as well – at least 5.1 surround, if not 7.1.) DVI is another connector found on some computers, which looks sort of like VGA below, and can be digital or analog depending on the variant (DVI-D is digital, DVI-A is analog, DVI-I carries both).
      HDMI CableDisplayPort CableDVI CableThunderbolt, Mini DisplayPort Cable
    • In the analog realm, you have VGA (normal computer), Composite (single yellow RCA plug), Component (three RCA plugs – green, blue, and red – which split the picture into luma and two color-difference channels, with sync riding along…you might have seen this with your Playstation 2, still high quality), RGBHV (five BNC connectors, splitting the signal into RGB plus horizontal sync and vertical sync), and S-Video (which basically no one uses, or ever used, but it’s there as its own weird multi-pin cable, and it’s only a small step up from Composite without the convenience).
      [Images: VGA cable, RCA composite, component cable, BNC composite, RGBHV cable, S-Video cable]
  • You think, “Digital, great, what could be better, just use that!” Wrong. A digital signal can only go about 15 feet before it stops and you have to use a signal amplifier – which is why you can’t easily find cables much longer than that. You can go longer, but then you’re using an “active” cable (or whatever a given brand calls it) that amplifies the signal, or one of the newer specs coming out. Digital only goes so far, and then, instead of degrading in quality, it just drops out. Not enough bits? No picture. Analog, on the other hand, can run really far (100′ is common for VGA and RGBHV), and past that you just get diminished quality.
  • You can, however, run video over ethernet cable. Both analog and digital can be extended this way, and there are extender boxes available to do the conversion at each end.
  • Different specs top out at different resolutions. Composite video, for instance, tops out at 480i (think DVD), whereas HDMI can give you 5k, and VGA can carry a whole wide range of resolutions – it comes down to the hardware driving it at that point. Furthermore, VGA carries essentially the same signal as the RGBHV mentioned above, just on a different connector; they work the same way.
  • I mentioned custom geometry above. Qlab makes it crazy easy to make a video surface into whatever shape you want, just click and drag, voila, you have a trapezoid or whatever that matches your masking and angle correction.
  • Wireless video…it’s very expensive to try and cut the cord. Sync & lag issues are huge, and I am not aware of any standards out there for wireless video feed yet. Don’t do it.

So, look at what hardware you have on hand. What does the projector support, the TVs, etc.? Look for what you have in common. The great thing, going back to the Mac Pro tower, is that there are dongles that adapt from Thunderbolt to a lot of different formats. But we have more things to consider:

  • Yes, you can run multiple videos at once on one machine. But know your codecs – some are power hungry, and once you start playing and decoding a few different things at the same time, which codec you chose starts to matter before things slow down. (h.264, while great for streaming because of its small file size, requires a lot of decoding and more system resources than, say, an uncompressed 422 video file. Further complicating things, with bigger files – like longer videos – you can hit bottlenecks in other ways. So it’s a matter of picking the codec that works best for the situation.) Similarly, WAV & AIF audio files are way bigger than MP3s, but they are better for live use because there is essentially zero decoding involved.
  • Qlab prefers video encoded as “ProRes 422 Proxy” – there’s a quick encoding sketch after this list.
  • Once you know what you are dealing with, you can figure out whether you need a hardware video switcher. Extron, Folsom – there are a bunch of companies. How graceful you need the switching to be will change your cost: more inputs and more transition options will obviously drive the price up versus something that handles, say, just RGBHV and cuts cold between sources.
  • You want camera feeds. Well, as we know now from studying the amazing NIN tour, the Kinect is a popular cheap, PC-connectible camera for tours: high resolution, wide-angle lens. But they use a proprietary connector, and I don’t know anything about how they get used in that environment. USB and Firewire cameras work, out of the box, with Qlab, and you can apply real-time effects and custom geometry. BUT…like digital video feeds, USB won’t go further than about 15 feet without an amplifier (neither will Firewire – notice a trend?). Plus, adding something like a USB extender adds to the lag you may already get from USB. So in that scenario, your camera subject has to be close to your computer. For more on what cameras and capture hardware are supported (Blackmagic video boxes, etc), see http://figure53.com/qlab/docs/camera-cues/.
  • If not USB/Firewire cameras, then you’re probably looking at Composite, Component, S-Video, and RGBHV connections. I don’t know what the digital connections are for cameras these days outside of Firewire, which is being phased out. There’s encoding and stuff going on there…it gets really hairy for someone like me who doesn’t do IMAG, etc, every day!
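
Since the codec question comes up constantly, here is a minimal sketch of batch-converting clips to ProRes 422 Proxy with ffmpeg, driven from Python. The folder names are placeholders and it assumes ffmpeg is installed on your machine; the important bits are the prores_ks encoder and profile 0 (Proxy).

```python
# Minimal sketch: batch-convert source clips to ProRes 422 Proxy for easier playback.
# Assumes ffmpeg is installed and on your PATH; folder names are placeholders.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("clips_raw")      # hypothetical folder of h.264 masters
OUTPUT_DIR = Path("clips_prores")   # converted files land here
OUTPUT_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mp4")):
    out_file = OUTPUT_DIR / (clip.stem + ".mov")
    subprocess.run(
        [
            "ffmpeg",
            "-i", str(clip),
            "-c:v", "prores_ks",    # ffmpeg's ProRes encoder
            "-profile:v", "0",      # profile 0 = ProRes 422 Proxy
            "-c:a", "pcm_s16le",    # uncompressed audio, no decoding at playback
            str(out_file),
        ],
        check=True,
    )
    print(f"Encoded {clip.name} -> {out_file.name}")
```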

Here is an idea for cheap video and multiple sources, if you don’t have a Mac Pro tower or whatever: connect each source to its own computer, put all of those computers on a network, and send MIDI or OSC commands over the network to trigger the videos to play – or anything else you want those computers to do. This gets us into cascading commands: your main Qlab computer sending commands to other Qlab computers, running AppleScripts…really, the possibilities are endless once you go down that rabbit hole. It also solves the distance problem for the camera mentioned above – put a computer close to the camera and you can even apply realtime effects to a USB feed. That was how I did Wizard of Oz: the live video “behind the curtain” that made him green and bloated was just the camera feed on another computer located backstage, controlled remotely via MIDI commands. Obviously, the other nice thing about using Qlab is that any lighting board with a MIDI port can then be slaved into the system, so your lighting and video cues work on the same Go button – and that Go can be fired from either place.
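
To make the networked idea concrete, here is a minimal sketch of firing cues on a remote Qlab machine over OSC using the python-osc library. The IP address and cue number are placeholders for whatever your backstage machine and workspace actually use; Qlab listens for OSC on UDP port 53000 by default, and /go and /cue/{number}/start are standard Qlab OSC commands.

```python
# Minimal sketch: trigger cues on a remote Qlab machine over OSC.
# Requires the python-osc package (pip install python-osc).
# The IP address and cue number below are placeholders for your own setup.
from pythonosc.udp_client import SimpleUDPClient

QLAB_IP = "192.168.1.50"   # hypothetical backstage Qlab computer
QLAB_PORT = 53000          # Qlab's default incoming OSC port

client = SimpleUDPClient(QLAB_IP, QLAB_PORT)

# Press GO on the remote workspace...
client.send_message("/go", [])

# ...or start a specific cue, e.g. a live-camera cue numbered "10".
client.send_message("/cue/10/start", [])
```

The same pattern works in the other direction, too – anything that can send OSC or MIDI can poke Qlab, which is how a lighting console and a video rig end up sharing one Go.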

I would love to get a tour of whatever you guys end up using! Let me know what else I can answer.

***

This show ended up with a lot of screens, most of them on the same feed and duplicated for effect. There were stacks of smaller TVs on either side of the stage, with a large, wide projection screen along the top. There was a live camera feed, driven by a shoulder-mounted “reporter camera” with an umbilical running off stage. Combined with effective use of strobes for the camera flashes in the news segments, it was a fitting application of tech to an important play.