The Joys of HDCP

In the “digital age,” a lot has changed in how content is delivered from source to destination. With video now broadcast in high definition and very little loss in quality, content producers wanted a way to keep their intellectual property from being duplicated and redistributed at that same high quality.

Enter HDCP.

What is it?
Short for High-bandwidth Digital Content Protection, this scheme uses encryption to scramble the signal between source and destination. HDMI is the interface you will most often see carrying HDCP-protected content (it also runs over DVI and DisplayPort), and licensed sources are forced to greatly reduce quality on any non-protected output. Eventually manufacturers will be forced to abandon analog outputs altogether through a phased-in “analog sunset.” How do content producers force this? From my understanding, it is part of the licensing agreement to implement HDMI output. (Please correct me if I’m a little off on this.) Since manufacturers want to stay current and implement HDMI, they must also follow the enforced analog sunset.

Some people may remember Macrovision, a technology that messed up colors and messed with the sync, making video unwatchable if you tried to use a second VCR to copy a tape playing on the first. Unlike those old methods, HDCP uses a digital key exchange between source and destination. Since the signal is digital, encrypting the stream blocks unauthorized devices from tapping into the line and showing or recording the high definition content.
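To make that key exchange a little more concrete, here is a toy sketch of how the HDCP 1.x handshake is structured: each device carries a public 40-bit KSV and 40 secret 56-bit keys, and both ends independently arrive at the same shared secret. Everything below is illustrative only: the “master matrix” is randomly generated rather than a licensed secret, and the real verification step uses the HDCP cipher, which I am skipping here.

```python
# Toy sketch of the HDCP 1.x key exchange (illustration only).
import random

NUM_KEYS = 40            # each HDCP 1.x device carries 40 secret keys
MOD = 1 << 56            # keys are 56-bit values, arithmetic is mod 2^56

rng = random.Random(0)

# Fake symmetric "master matrix" standing in for the licensing
# authority's secret. Symmetry is what makes both sides agree.
master = [[0] * NUM_KEYS for _ in range(NUM_KEYS)]
for i in range(NUM_KEYS):
    for j in range(i, NUM_KEYS):
        master[i][j] = master[j][i] = rng.getrandbits(56)

def make_ksv():
    """A KSV is a public 40-bit value with exactly twenty 1-bits."""
    bits = [1] * 20 + [0] * 20
    rng.shuffle(bits)
    return bits

def device_keys(ksv_bits):
    """Private key vector = master matrix times the KSV bit vector."""
    return [sum(master[i][j] * ksv_bits[j] for j in range(NUM_KEYS)) % MOD
            for i in range(NUM_KEYS)]

def shared_secret(my_keys, their_ksv_bits):
    """Km: add up my private keys selected by the other side's KSV bits."""
    return sum(k for k, b in zip(my_keys, their_ksv_bits) if b) % MOD

aksv, bksv = make_ksv(), make_ksv()            # source and display KSVs (public)
src_keys, snk_keys = device_keys(aksv), device_keys(bksv)

km  = shared_secret(src_keys, bksv)            # computed by the source
km_ = shared_secret(snk_keys, aksv)            # computed by the display

# If the two values match, the link authenticates and stays encrypted.
print("authenticated" if km == km_ else "NO SIGNAL")
```

If the two sides don’t arrive at the same value, the practical result is exactly what you see in the field: the display blanks out.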

Although there are exceptions, TVs and projectors typically support HDCP. It is important to note that displays classified as computer monitors may not fall into this category, as HDCP is a licensed technology and implementing it adds cost and restrictions to the device. You can always check the specs to make sure everything is HDCP compliant.

Do I need it?
Currently the only technologies implementing HDCP are Blu-ray and HD tuners. In the future, we will start to see certain computer-delivered content implement this protection as well. In my opinion, if you are implementing a digital solution, you should build HDCP compliance in from the start. That being said, it may not be required in systems where only in-house content is shown.

How to implement HDCP
To ensure you are HDCP compliant, every device between source and destination must be HDCP compliant. The fun starts here, because that step alone will not guarantee the system will work. Why? With HDCP, each source supports only a finite number of downstream keys. That number is not published in spec sheets, and it limits how many displays you can connect to one source even if your DA (distribution amplifier) is HDCP compliant.
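A rough way to think through this at design time is sketched below. The 127-device and 7-level limits come from the HDCP 1.x repeater rules; the 16-key source limit is purely a placeholder, since, as mentioned above, real sources rarely publish theirs.

```python
# Rough planning aid, not a guarantee: sums up downstream devices and
# compares against protocol ceilings plus an assumed per-source limit.
SPEC_MAX_DEVICES = 127     # HDCP 1.x ceiling (MAX_DEVS_EXCEEDED beyond this)
SPEC_MAX_DEPTH = 7         # HDCP 1.x ceiling (MAX_CASCADE_EXCEEDED beyond this)

def check_chain(displays_per_branch, repeater_levels, source_key_limit=16):
    """displays_per_branch: display counts hanging off the DA(s).
    repeater_levels: how many repeater hops (DAs, switchers) deep the
    farthest display sits.  source_key_limit: assumed number of
    downstream keys the source can track (placeholder value)."""
    total_devices = sum(displays_per_branch) + repeater_levels
    problems = []
    if total_devices > SPEC_MAX_DEVICES:
        problems.append("exceeds the 127-device protocol limit")
    if repeater_levels > SPEC_MAX_DEPTH:
        problems.append("exceeds the 7-level cascade limit")
    if total_devices > source_key_limit:
        problems.append(f"exceeds the source's assumed {source_key_limit}-key limit")
    return problems or ["should authenticate"]

# Example: one switcher feeding one DA that feeds 8 displays.
print(check_chain(displays_per_branch=[8], repeater_levels=2))
```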

Troubleshooting
It can be really challenging to troubleshoot HDCP-related issues. The “warning” that HDCP has failed is unfortunately the same “warning” you get when the cable is unplugged… NO SIGNAL. Test equipment that could help troubleshoot these problems is out of the price range of pretty much everybody, as the transmission speeds over HDMI require very expensive hardware to diagnose. (To put it into perspective, I think Extron has ONE high-bandwidth oscilloscope in their entire organization, for development and manufacturing QA purposes, that is capable of viewing the “eye” of a digital signal… again, correct me if I’m wrong.)

I have found that rudimentary troubleshooting techniques are the only real way to resolve these issues in the field. Isolate the variables and test. Bypass as much equipment as you can and start with one source connected directly to one destination. Then add devices back into the signal path one at a time and see what makes everything fail. Keep in mind that cable length and the quantity of keys can also cause problems, which may require design changes to fix.
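If it helps, here is that same process written down as a simple checklist generator. The device names are just placeholders for whatever is actually in your rack.

```python
# Generate the "bypass everything, then add one device back at a time"
# test sequence for a given signal chain.
chain = ["Blu-ray player", "switcher", "DA", "long HDMI run", "display"]

def test_plan(devices):
    """Yield test setups, starting from the bare source->display pair
    and re-inserting one in-line device per step."""
    source, *middle, sink = devices
    yield [source, sink]                       # step 1: bypass everything
    for i in range(1, len(middle) + 1):
        yield [source, *middle[:i], sink]      # add the next device back in

for step, setup in enumerate(test_plan(chain), start=1):
    print(f"Test {step}: {' -> '.join(setup)}")
```

Whichever step first gives you NO SIGNAL points at the device (or cable run) to dig into.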

I know this is a hot issue in the AV world. Stay posted for more information. If you have any comments, stories, or techniques on HDCP issues, please leave a comment below and let us know your thoughts.