
Broadcasting MAGFest 12: Bringing Nerd Music to the World

Part One: How Did We Get Here, or “Good thing we didn’t screw that up.”

A week before MAGFest 11, I received a call from Will.  One of the main guests, Yuzo Koshiro, had asked the MAGFest staff if it would be possible to do a video broadcast of his DJ set on Saturday night.  The plan up to that point was to work with Arecibo Radio to broadcast audio of the various performances, so this was a new wrinkle.

Thankfully, I had joined the 8BitX network a couple of months earlier after winding down my old podcasting website, and I still had all of my streaming equipment.  So I was prepared to do a video stream if needed.  All of the gear came with me to the festival… and believe me, it’s a lot of gear.

Originally we were only going to stream Koshiro’s and Dj CUTMAN’s sets, but at the last moment Will decided to go ahead and stream everything for Saturday night.  I’m glad he did, because the stream went over really well, especially considering we had little to no time to promote it.  The MAGFest staff were extremely happy with the quality of the stream; so much so that they stopped watching their own internal stream and started watching ours.

Shortly after that we were asked to stream the Game Over Austin show, and after that BitGen.  With BitGen we had the opportunity to use some new equipment that allowed for multiple camera angles.  That was a new learning experience, and the 8BitX team did a great job (unfortunately, I had to miss that show).

Because of what we were able to pull off after MAG11, MAGFest started a new department specifically for coordinating the streaming efforts of the various groups.  And we took the lead on putting together the video streams for the Main and Second Stages.

I guess we demonstrated that we knew what we were doing.  Or else we were really good at faking it.

Part Two: Equipment and Setup, or How the Video Sausage Gets Made

Putting together a single-source (camera, game console, etc.) stream is pretty easy and well documented on the Internet.  But a good multi-source broadcast requires a lot more equipment and planning.  First, let’s go over the equipment we had on hand:

Cameras/Accessories:

3x Canon Vixia HF M301

1x Canon Vixia HF R20

2x HDMI-SDI converters

Main Stage Broadcast:

Blackmagic Design ATEM Television Studio (switching controlled by MacBook and ATEM software)

Core i7-2600K-based system

Blackmagic Design Intensity Pro

Second Stage Broadcast:

Core i7-930-based system

AverMedia HD DVR capture card

When it comes to cameras, the main thing to look for is one that can send a ‘clean’ video signal over its HDMI output (no info overlays, etc.).  If you also plan on recording directly on the camera while broadcasting, make sure it can send a high-resolution signal while recording.  Some cameras stop sending a signal entirely while recording, while others drop to a lower-resolution image over HDMI.
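
One way to check this before show day, if you already have a capture card between the camera and a PC, is to try grabbing a frame at the resolution you expect while the camera is idle and again while it is recording.  Below is a minimal sketch of that test, assuming ffmpeg on a Windows machine and a hypothetical DirectShow device name; it is an illustration, not the exact procedure we used.

    # Sketch: grab one frame from the capture card at the resolution the
    # camera should be sending. Run it once while the camera is idle and
    # again while it is recording; if the camera drops or downgrades its
    # HDMI output while recording, the second grab will fail.
    # Assumes ffmpeg on Windows; the device name below is hypothetical.
    import subprocess

    DEVICE = "AVerMedia HD Capture"  # placeholder; list real names with:
                                     #   ffmpeg -list_devices true -f dshow -i dummy

    result = subprocess.run(
        ["ffmpeg", "-y", "-hide_banner", "-f", "dshow",
         "-video_size", "1920x1080",   # the resolution you expect over HDMI
         "-i", f"video={DEVICE}",
         "-frames:v", "1", "test_frame.png"],
        capture_output=True, text=True,
    )
    print("signal OK" if result.returncode == 0 else result.stderr)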

The Canon Vixia cameras we used can do both, which is impressive when you consider they are consumer-level camcorders.  There are a couple of features I wish they had from prosumer-level cameras, but given their cost they work well.  There are still a few things to look out for when using cameras like these, which I will cover in the next section.

Having a clean signal sent over HDMI is great and all, but the standard isn’t meant to be run over long distances.  Thankfully we had two HDMI-to-SDI converter boxes handy.  With those converters we were able to run a single long, thin SDI cable from each of the two cameras set off to the sides all the way to the tech island (at least 50 feet).  For the camera covering the middle wide stage shot, we ran a normal HDMI cable since it was sitting right by our broadcast switcher.

This was the first time I had used the Blackmagic ATEM Television Studio switcher, but I was familiar with its capabilities since I had researched it for possible use on my previous podcast.  It’s pretty powerful for a device that costs just under $1,000, but a few things were lacking.  The Studio can store images, but A) the upload process is a bit complicated, and B) it has very limited space.  If we hadn’t needed multiple sets of between-set graphics, logo overlays, etc., we quite possibly could have reduced the amount of hardware needed for broadcasting.  Unfortunately more hardware was required, but thankfully it was really easy to integrate.

The Television Studio can send out a signal two different ways: 1) a compressed signal over USB 2.0 to a machine running Media Express for capture (which I wasn’t aware of at the time), or 2) an uncompressed HDMI signal.  We took the HDMI signal and ran it into a Blackmagic Intensity Pro on a machine running XSplit 1.3.  While we needed two separate computers to pull this off, it let us combine the switching capabilities of the Television Studio with the scene-creation/switching power of XSplit.  It also left more than enough CPU headroom to process the broadcast, so any issues that came up would not be due to processor bottlenecks.
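
For anyone curious what that last leg of the chain looks like outside of a GUI, here is a rough command-line equivalent of the capture-and-encode stage, sketched with ffmpeg instead of XSplit.  The device names and ingest URL are placeholders, and XSplit’s internals almost certainly differ; this just shows the shape of the pipeline.

    # Rough equivalent of the capture -> encode -> stream leg that XSplit
    # handled for us. Assumes ffmpeg on Windows with the Intensity Pro
    # exposed as a DirectShow device; names and the RTMP URL are placeholders.
    import subprocess

    VIDEO_DEV = "Blackmagic WDM Capture"                 # hypothetical name
    AUDIO_DEV = "Line In (High Definition Audio)"        # hypothetical name
    INGEST = "rtmp://ingest.example.com/live/streamkey"  # placeholder URL

    subprocess.run([
        "ffmpeg",
        "-f", "dshow", "-i", f"video={VIDEO_DEV}:audio={AUDIO_DEV}",
        "-c:v", "libx264", "-preset", "veryfast",  # light enough for a 2600K
        "-b:v", "3500k", "-g", "60",               # bitrate and keyframe interval
        "-c:a", "aac", "-b:a", "160k",
        "-f", "flv", INGEST,                       # RTMP ingest expects FLV
    ])

The two-machine split pays off right here: the x264 encode is the CPU-heavy part of the whole chain, and keeping it on its own machine is what left the headroom mentioned above.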

The Second Stage setup was a lot simpler in terms of equipment: one camera run directly into the HDMI input of an AverMedia HD DVR capture card on a machine running XSplit 1.3.  For both stages, the audio was a mix of the feed from the tech island mixer and a couple of microphones set high above the audience (AT2020s for Main Stage, unknown for Second).

Since Second Stage and Chip Showcase (which was in the Main Stage ballroom) ran at the same time, only one camera was used for these performances.  We had considered adding a second interview camera specifically for Second Stage, but we weren’t able to pull that off due to a lack of equipment.  That probably ended up being a good thing, given some issues I ran into while preparing for the festival.

Part Three: Lessons Learned, or “Let’s hope no one noticed that.”

A: You can only cover so much ass.

Since we were being asked to broadcast both the Main and Second Stages, which would include the US performance debuts of two different bands (one of which I happen to be a fan of), I wanted to do everything I could to make sure we would be able to stream these shows the way we wanted, regardless of technical setbacks.  That turned out to be more difficult than I had thought… though not in the way I expected.

Since we were doing a multi-camera setup for Main Stage using the Television Studio, I tried replicating its multi-input capability as a backup.  Our main streaming system could already support two HDMI sources through the Intensity Pro and HD DVR cards, so all I needed was a third input.  Given the price, I originally went with adding a second HD DVR card.  But before I did, I reached out to XSplit support to make sure the software would not have issues with multiples of the same card in one system.  Split Labs said they were not aware of any issues with such a setup.

It turns out that when you have two of the same AverMedia card in a system, Windows will recognize the second card as working, but only the inputs of one card can actually be captured.  It’s even specific to whichever card was present ‘first’: even if the first card is disabled in Device Manager, the inputs of the second card will not capture.
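
We found this out the hard way, but it is easy to check on any given machine: just try to pull a frame from each card and see which ones actually deliver.  A minimal sketch, again assuming ffmpeg and hypothetical device names:

    # Test whether each of two identical cards can actually deliver video,
    # not just enumerate. Assumes ffmpeg on Windows; the names below are
    # hypothetical (duplicate dshow devices can usually be told apart by the
    # "alternative name" shown by: ffmpeg -list_devices true -f dshow -i dummy).
    import subprocess

    for name in ["AVerMedia HD DVR", "AVerMedia HD DVR #2"]:  # placeholders
        result = subprocess.run(
            ["ffmpeg", "-hide_banner", "-f", "dshow", "-i", f"video={name}",
             "-frames:v", "1", "-f", "null", "-"],
            capture_output=True, text=True,
        )
        ok = result.returncode == 0
        print(f"{name}: {'captured a frame' if ok else 'no capture'}")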

One possible workaround is to use three different input devices, each from a different manufacturer.  The problem is that not all devices have the same capabilities within XSplit.  For example, one device I tried was the Diamond GC1000 USB capture device.  It unfortunately does not have direct capture support in XSplit, so I would have had to use a combination of the DMCap software and the screen region capture option within XSplit itself.  Since I was only planning on using one monitor for the Main Stage system, that was not a workable solution.
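
For reference, the screen-region trick itself is straightforward; the dealbreaker was the single monitor, not the capture.  Something like the following shows the idea, using ffmpeg’s gdigrab input as a stand-in for XSplit’s screen region source, with placeholder coordinates for wherever a preview window might sit:

    # Sketch of screen-region capture on Windows via ffmpeg's gdigrab input,
    # the same idea as XSplit's screen region source. The offsets and size
    # are placeholders for wherever the capture software's preview window
    # sits on the desktop.
    import subprocess

    subprocess.run([
        "ffmpeg", "-y", "-f", "gdigrab",
        "-framerate", "30",
        "-offset_x", "0", "-offset_y", "0",  # top-left corner of the region
        "-video_size", "1280x720",           # region dimensions
        "-i", "desktop",
        "-t", "5", "region_test.mp4",        # record five seconds as a test
    ])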

Due to time and money constraints, I unfortunately couldn’t find a solution that replicated the multi-input capability.  So we’d have to hope the ATEM wouldn’t fail, or else drop to two cameras.  Or get really, really creative.

Secondly, I tried to make sure I had enough adapters, cables, etc. just in case there was a gap somewhere.  When you’re working with things like HDMI cables, that can get pricey rather quickly.  Thankfully it turned out we didn’t need any of it, so all of it was returned.  But I did learn something from this: do everything you can to find out what the venue/convention/festival has on hand before attempting to purchase/transport all the things.

B: Find a common level of communication beforehand

I had worked with one of the members of the 8BitX crew on video projects in the past, so they were used to my working style.  But everyone else on the team was basically ‘new’ to me.  At first this led to a couple of issues with camera direction, but we were able to get them sorted out during the production.  It made me realize that I should have established a baseline for everyone’s knowledge level before the festival and, based on that, settled on a common form of communication for the whole team so everyone would know what was going on at all times.  Thankfully the rest of the 8BitX staff are quick learners, and we had things smoothed over on Day One.  But it is something to remember the next time I work with people who may not have a similar skill set.

Part Four: Conclusion, or “…we didn’t suck?  Oh good.”

Considering everything that was going on with this project, I think we did a really good job with our broadcasts for MAGFest 12, especially when you take into account the varying experience levels on the team and what everyone was asked to do.  It also shows that you do not have to be a professional broadcaster to put out a quality product.  With enough planning and resources, any group should be able to do something similar to what we did.

Given that it had been some time since I had directed such a large project, I had concerns about whether we would be able to pull everything together and make not just a workable broadcast, but one that people enjoyed watching.

Since we’ve been asked to do this again next year, I guess we succeeded.

Robert Swackhamer, Director of Video
