About what do AV designers talk? Design, certainly, in all of its forms. Past projects and wish lists. Perhaps most of all, we talk about technology. For all of our talk about these things, there are relatively few actual product reviews or comparisons. I'll talk about products here, but stop short of a formal endorsement or non-endorsement. Why is that, and what are the perils of making such comparisons? I can illustrate with two examples and an apology.
First, InfoComm 2013. AV_Phenom Mark Coxon saw the dizzying array of HDMI-over-structured-cable extension systems and decided that an old-fashioned "shoot-out" was in order: he'd take a selection and compare them for the benefit of the rest of us in the industry. Right away he ran into problems.
1. Acquiring Appropriate Gear
Manufacturers, outwardly at least, pretty much universally agreed that this was a great idea. When it came time to actually get sample products, though, roadblocks appeared. Trade-show demos were strapped down to permanent displays. Power supplies were missing. Nobody knew whether the units had the latest firmware. Some of these might have been legitimate issues; some might have been discomfort with the risk of taking part in a showdown under conditions they couldn't control. For whatever reason, it's something manufacturers aren't quite comfortable with.
If you read the original post from last year, you'll see that on the first try none of the extenders worked. At all. Blank screens all around. Removing the extension system and running the sources directly to the display, of course, resulted in a perfectly clear picture. Every element of the test had been proven good except the extension system. Which means the extenders were bad. Right?
Not right. Replacing an active HDMI cable with a 99-cent special made everything work. So the problem is the active HDMI cable. Right?
Not right. Months later I saw the same problem: an active cable not working on the back end of an extender. Replacing it with a seemingly identical cable made the problem go away. The issue appears to have been a defective cable: not quite defective enough to give no picture at all, but marginal enough to fail with some equipment.
2. Notes on Digital Video
As I'm sure you're aware, a digital signal is just a string of ones and zeros. One way to measure the integrity of such a signal is with an oscilloscope. Ideally, the ones sit very high and the zeros very low, with a clear, sharp transition between them; the overlaid display of these transitions is called an "eye pattern." As the signal attenuates and picks up noise, the "eye" flattens and becomes less sharply defined. Different receivers have different "eye masks," their tolerance for imperfect signals. The problem with this kind of test is that, absent some rather costly and complex test equipment, it is impossible to determine where the signal is degrading, how, and to what extent. We're left with the binary "it works / it doesn't work." Which leads to the final issue:
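To make the "eye" idea concrete, here is a toy numeric sketch, not a real eye measurement: it models random one/zero bits, applies attenuation and additive noise, and reports the worst-case gap between received ones and zeros at the sampling instant. All the numbers are illustrative assumptions.

```python
import random

def eye_opening(attenuation_db, noise_sigma, n_bits=2000, seed=42):
    """Toy model: send random NRZ bits (0 V / 1 V), apply attenuation
    and Gaussian noise, and return the worst-case 'eye opening' --
    the lowest received one minus the highest received zero."""
    rng = random.Random(seed)
    gain = 10 ** (-attenuation_db / 20)  # dB loss as a voltage ratio
    ones, zeros = [], []
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        rx = bit * gain + rng.gauss(0, noise_sigma)
        (ones if bit else zeros).append(rx)
    return min(ones) - max(zeros)

# A clean link has a wide-open eye; add loss and noise and it collapses.
print(eye_opening(attenuation_db=1, noise_sigma=0.02))   # clearly positive
print(eye_opening(attenuation_db=12, noise_sigma=0.08))  # far smaller, likely negative
```

Whether a given receiver still decodes the marginal case depends on its eye mask, which is exactly why a cable can "mostly" work with some gear and fail with other gear.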
3. They All Work
Coxon's result was something I could have told him before he started: all of the extenders were able to pass video. After all, a manufacturer selling a product that simply doesn't do what it is advertised as doing would be rather shocking, and would mean a short life for that manufacturer. Yes, there are secondary tests he could have performed but didn't: unplug and reconnect the video source to measure sync time; unplug and reconnect the power to compare startup time. The larger point is that these kinds of devices have become somewhat commoditized; not only do many have the same function, but they also have similar form factors and, under the hood, use the same chipsets. It's ultimately a comparison of apples to very slightly different apples.
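Those secondary tests are easy to script once a test rig exposes some sync indicator. A minimal sketch, where `predicate` is a stand-in for a hypothetical query of your own rig (a capture card's lock status, a display's "signal detected" flag, and so on):

```python
import time

def seconds_until(predicate, timeout=30.0, poll=0.05):
    """Poll `predicate` until it returns True and report the elapsed
    seconds, or None if it never locks within `timeout`. `predicate`
    stands in for whatever sync indicator the test rig exposes."""
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if predicate():
            return time.monotonic() - start
        time.sleep(poll)
    return None  # never achieved sync

# Demo against a dummy predicate that 'locks' after roughly 0.2 s:
start = time.monotonic()
elapsed = seconds_until(lambda: time.monotonic() - start > 0.2,
                        timeout=2.0, poll=0.01)
print(elapsed)
```

Run the same stopwatch across each extender after unplugging and reconnecting the source (sync time) or the power (startup time), and you get a number you can actually compare.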
Are there manufacturers with product lines and ecosystems better suited for one application or another? Yes. Are there some which offer better reliability and better customer service? Also yes. I’m not quite ready to say that digital switching and transport is a pure commodity in which any device is equivalent to any other. What I AM saying is that they’re close enough that, absent a great deal of time and equipment, it’s quite challenging to make meaningful performance comparisons.
Which brings me back to the beginning, in which I owe somebody an apology.
A few months ago, in my "visit to Extron" post, I commented that their XTP switcher changed sources slowly, especially when switching between unprotected and HDCP-protected content. This is true; it was unacceptably slow, and much more so than other, similar products. So when one of my colleagues (SMW senior consultant Joe Gaffney) received some Extron demo gear and saw that it switched very slowly, I found myself unsurprised. Fortunately, this was a test in the comfort of our office, and Mr. Gaffney is quite diligent about getting things right. After watching the indicator lights on the front of the unit and making several calls to Extron, he determined that there's a setting to drop the HDCP handshake when non-HDCP sources are selected. With that setting turned on, the unit has to initiate a new three-way handshake every time a protected source is selected again. Hence the long wait time. Turning it off made the unit behave much more reasonably.
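The effect of that setting can be sketched as a toy timing model. The numbers below are illustrative assumptions, not Extron specifications: assume any switch costs half a second, and a fresh HDCP authentication adds two more.

```python
# Toy model of source-switch timing, with made-up numbers
# (illustrative assumptions, not Extron specifications).
BASE_SWITCH = 0.5   # assumed cost of any source switch, seconds
HDCP_REAUTH = 2.0   # assumed cost of a fresh HDCP authentication

def switch_time(protected, handshake_maintained):
    """Modeled time to switch to a source. If the handshake was
    dropped while an unprotected source was up, selecting a
    protected source forces a full re-authentication."""
    t = BASE_SWITCH
    if protected and not handshake_maintained:
        t += HDCP_REAUTH
    return t

# Alternate between an unprotected and a protected source five times.
pattern = [False, True] * 5
dropped = sum(switch_time(p, handshake_maintained=False) for p in pattern)
kept    = sum(switch_time(p, handshake_maintained=True) for p in pattern)
print(dropped, kept)  # dropping the handshake pays the re-auth cost on every protected switch
```

The gap grows with every protected-source selection, which is why the symptom looks dramatic in a demo that bounces between protected and unprotected content.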
Is this what happened with the XTP demo at Extron's demo facility? Without an actual XTP matrix in hand I can't say for certain, but I must admit that it's a possibility. While a manufacturer should always be sure their demo is configured to show the product in its best light, we need to remember not to take first impressions at face value.
The moral of the story? Sometimes we all get things wrong. Evaluating products is hard. It’s OK to make judgments, but make them carefully and be open to the possibility of revisiting them.
Those morals are a bit more universal than the world of AV, aren’t they? Perhaps therein lies another lesson.