I’m Bringin’ Codecs Back! What?

If there has been any trend in AV that is glaringly apparent over the last few years, it is that the number of hard codecs being sold for video conferencing is declining.  Although videoconferencing is being more widely adopted, hardware sales dollars are decreasing, to a huge extent because the codec has gone to software, moved to the cloud, or been virtualized somewhere and shared by many companies as needs arise.

I even wrote a blog post in 2014 about the phenomenon, asking if Polycom and Lifesize would be the next Blockbuster.

The hard codec essentially became a casualty as efficiencies in virtualization and improvements in internet bandwidth came together.  Companies decided they didn’t want to own and manage all that infrastructure, and the new soft-codec-based options had become “good enough”.

Sure, there are always holdouts: companies with such a huge sunk cost that they just continue forward, or those that really do have quality and security needs that make them slightly uncomfortable with a solution hosted elsewhere.  However, those situations are becoming fewer and farther between.

Good enough could be achieved at 1/4 the cost of great, so the march of commerce and convenience again trumped quality.

So why on earth, with all of these events in motion and well beyond the point of no return, would I declare that I’m now bringing codecs back?  Well, if the mantra of real estate is “Location, location, location!”, then the mantra of AV should be “Application, application, application!”  And the application at hand here (pun intended) is…

Haptics. Specifically the Internet of Touch or telemanipulation.

I was at SIGGRAPH this year and I ran into the control room guys from Christie.  They partnered with TechViz to create an amazing 3D projection demo where you wear 3D glasses and look at the layout of a virtual piece of machinery.  You reach out and grab their haptic joystick, and then manipulate it to reach inside the piece of machinery to find a bolt.  Here is the really tricky part.  The haptic joystick is connected to a resistance arm.  If you try to press it forward when you are up against the side of the virtual piece of machinery in front of you, it doesn’t move.  You have to move your tool until you are lined up with the opening, and once you are, the tool will move forward in 3D space again.  It’s like a giant, incredibly complex game of Operation for engineers.

Now if you’re thinking, “Wow Mark!  That’s cool but completely irrelevant to anything but gaming”, you’re wrong.  Start to think of the applications for telemanipulation: control of robotics in remote areas, manipulation of tools for remote service, control of NASA’s Robonaut, or of course even telesurgery.

An article about the last item, telesurgery, actually inspired me to write this.  Think about the ramifications of anything less than the highest level of real-time audio, video, and control for something like a surgical instrument, and then ask yourself if you’d trust this type of communication to anything but a high-end on-premise system with built-in redundancy.

Your video communication drops for two seconds right when you are cauterizing an artery during microsurgery, and a patient bleeds out on the table.  Your haptic-controlled robot lags a few extra milliseconds, the command to stop moving forward is delayed, and a laser makes an incision in the lung during heart surgery.
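To put rough numbers on the lag problem, here’s a back-of-the-envelope round-trip budget sketched in Python. The figures (fiber propagation at roughly two-thirds the speed of light, the codec and per-hop delays) are illustrative assumptions of mine, not measurements from any real system:

```python
# Back-of-the-envelope round-trip latency budget for a remote haptic link.
# All figures below are illustrative assumptions, not vendor specifications.

SPEED_OF_LIGHT_FIBER_KM_S = 200_000  # light travels at roughly 2/3 c in fiber

def round_trip_latency_ms(distance_km, encode_ms, decode_ms,
                          network_hops, per_hop_ms):
    """Estimate round-trip latency: propagation both ways, plus codec
    and per-hop queuing/switching delays in each direction."""
    propagation_ms = (distance_km / SPEED_OF_LIGHT_FIBER_KM_S) * 1000
    one_way_ms = propagation_ms + encode_ms + decode_ms + network_hops * per_hop_ms
    return 2 * one_way_ms

# A 2,000 km surgeon-to-robot link with modest codec and hop delays:
rtt = round_trip_latency_ms(distance_km=2000, encode_ms=5, decode_ms=5,
                            network_hops=8, per_hop_ms=1)
print(f"Estimated round trip: {rtt:.0f} ms")  # prints "Estimated round trip: 56 ms"
```

Notice that the encode and decode terms are the ones a purpose-built box can actually control; the speed of light is not negotiable.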

Now consider that the engineering and service examples could be just as critical to life safety.  It all adds up to the codec being an extremely relevant piece of hardware wherever haptics are involved, and one that may have even greater implications for new encoding and decoding strategies, given that the existing protocols were meant for voice and video, not touch.
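To make the “protocols weren’t meant for touch” point concrete, here is one hypothetical wire format for a single haptic sample, sketched in Python with the standard-library struct module. The field layout (sequence number, microsecond timestamp, XYZ force) is purely my invention for illustration; no standard defines it:

```python
import struct

# One hypothetical wire format for a single haptic sample: sequence number,
# timestamp in microseconds, and XYZ force in newtons.  The layout is
# illustrative only; no existing standard defines it.
HAPTIC_FRAME = struct.Struct("!IQfff")  # network byte order, 24 bytes total

def pack_sample(seq, timestamp_us, fx, fy, fz):
    """Serialize one haptic sample into a fixed-size binary frame."""
    return HAPTIC_FRAME.pack(seq, timestamp_us, fx, fy, fz)

def unpack_sample(frame):
    """Recover (seq, timestamp_us, fx, fy, fz) from a binary frame."""
    return HAPTIC_FRAME.unpack(frame)

frame = pack_sample(42, 1_700_000_000_000_000, 0.5, -1.25, 9.0)
print(len(frame))            # prints 24
print(unpack_sample(frame))  # prints (42, 1700000000000000, 0.5, -1.25, 9.0)
```

A fixed 24-byte frame like this is trivial to encode; the hard part, the one a touch-aware codec would have to solve, is delivering a steady stream of them at haptic update rates without jitter.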

If you are in the integration business and want to leverage your hard codec experience in a market that has virtually disappeared, I suggest you look toward specialty applications across verticals where the hard codec may have just found its new home.

The hard codec isn’t dead after all.  It has just moved from the conference room to the control room or the operating room.

So go ahead, get your codec on.  We’re bringin’ codecs back.


About Mark Coxon

Mark Coxon is an AV industry native and blogger for the rAVe BlogSquad. You can reach him directly at mark@marketexplosion.me.

  • Simon Dudley


    Interesting piece. The specialist world of touch is certainly interesting and gives us lots to think about in the video conferencing space. One could argue about whether it’s only about telemedicine; I’m sure there are other applications where it would be of use. There are certainly lots of applications where feedback would be very useful.

    However, I will say that your argument about HW codecs coming back doesn’t stand up to scrutiny. I see zero reason why a software application couldn’t do this type of work. In fact, it would be easily arguable that writing this in a standard PC format would be much easier than creating a new hardware platform or attempting to convert an existing platform to incorporate touch.

    Standard platforms such as PCs and Macs are ultimately jack-of-all-trades technologies. They are pretty good at most things, but not specialist. I can think of no better example of why a SW codec, coupled with a stand-alone platform millions of people are already familiar with, would be the ideal platform for the sort of ideas you’re discussing.

    Good article though and as always thought provoking.

    Keep up the good work.

    Kindest Regards


    • Mark Coxon


      Thanks for reading and chiming in. Given your background in the space, I’m always interested in your opinions on these things.

      As a quick aside, I agree that you could in fact dedicate an on-premise PC to running a specialized piece of software to do the encoding and decoding.

      I would also argue that you’ve essentially just built a hard codec, as what is a hard codec other than a small dedicated PC running software for encoding and decoding?

      You could also use a light source to illuminate a polarizer and create video and not have to use a “display”…

      Seriously though, the point is that off-prem, virtualized, cloud-based systems, despite their efficiencies in a pure VTC environment, will most likely not meet the “good enough” criteria when it comes to systems whose real-time operation may mean life and death.

      Whether the final solution is a hard codec with preinstalled software, or specialized software sold as a service to be installed on a dedicated PC for the same purpose, the fact remains that the degree of liability involved in the hardware and software’s reliable operation will require implementation by a preferred partner and will come at a premium over “free”, “good enough” solutions, creating a new opportunity for integrators to focus their attention on and drive revenues.

      Best to you sir!

      Mark C

  • Leonard Suskin

    What is a Codec?

    In our world, we use the word in two senses. The first is the encoding scheme for video and audio: H.264, H.265, JPEG2000, and VP9 are all codecs. Those will continue to find use as increasingly dense video information needs to be transmitted, stored, and retrieved.

    The second sense, the one you are using, is a hardware-based video-conference appliance. We can ask the next question: what is a hardware-based video-conference appliance? It is, at its heart, a purpose-built single-function computer.

    The question isn’t “Codec or no Codec” — we always will need to encode and decode. The question isn’t even “hardware or software” — a video-conference appliance is, at its heart, an engine for running software. The actual question is “single-purpose hardware or general-use computer?”

    The arguments for the former are becoming less compelling to me as computing power becomes less costly and general-purpose computers are more capable of handling these tasks at very reasonable costs.

    It is an interesting discussion. I don’t have a crystal ball, but I think that the Mark Coxon of two years ago had it right; the days of single-purpose hardware are, at this point, numbered.

  • mjgraves

    In reality, the “codec” hardware you reference was traditionally DSP-based. In modern times it has transitioned to software on more general purpose hardware. That can be an appliance or a PC, it’s just a question of form factor. The big difference being that most modern approaches are fundamentally extensible in software.

    I think that WebRTC could be especially interesting in this area. The availability of the data channel provides a secure pathway for any kind of auxiliary data that might be required by the additional hardware.

  • Mark Coxon

    mjgraves and Leonard,

    Nice to see two sharp minds at work here in the comments. I appreciate the interaction!

    I love your take here and honestly don’t know as much about WebRTC and the data channel as I should (time for some self-study).

    Leonard and MJ,

    I am thinking of this from a liability and reliability perspective, and in that case, would you really want a general-purpose piece of hardware running other software for different applications that could crash the system?

    Imagine you engage a Tier IV data center to provide maximum uptime and redundancy for your computing and data needs, and the data center agrees to provide that level of availability. Do you think they’d let you provide the switches, servers, or SAN devices? Your devices may be able to do the job, but then again they may not, so the center would never guarantee uptime or availability in that scenario.

    Move to the telesurgery example. If you build a robot and control software for it to do life and death procedures, would you allow your software to be installed on a PC that is running other software that may crash your program? There is little chance that would happen.

    You would most likely require a tested device, dedicated to the task at hand, where the predictability of failure can be assessed.

    I would even argue that a company like this would most likely also control the specifications for the display being used as well given the sensitivity of the procedure and potential impact of what a pixel error or poor resolution would mean.

    Think of scenarios in AV where the stakes are much lower. Crestron traditionally won’t troubleshoot systems that don’t use their cables. Magenta Research had 3 or 4 CatX cables that could be used with their low-skew transmitters and receivers, otherwise all bets were off and they guaranteed nothing. And this was for video transmission, not microsurgery.

    I definitely agree that most of our traditional purpose built hardware in AV (especially DSP, VTC, Control) may move to reside on general purpose machines as applications rather than appliances. I’ve written that in several places and stand by it.

    However in these scenarios, I don’t think “good enough” cuts it. It needs to be nearly perfect, which means dedicated, purpose driven hardware may be the only answer. BYOPC is not a winning strategy when the stakes are that high. Just my take.

    Thanks again for all the great comments!

    • mjgraves

      Oh, it needn’t be BYO by any stretch. I imagine that the medical vendor community has policy on that front. As you cite with Crestron, they don’t want to support what they don’t control.

      The problem with dedicated hardware is time. It takes a long time to design and build, and then it isn’t as readily upgraded in the field.

      The debate about appliances vs. computers is ages old. I spent 20 years in broadcast graphics, installing and integrating dedicated hardware systems. Prior to 2000, true dedicated hardware ruled. It was the only way to ensure real-time performance.

      From 2004 onward, these were all Windows PCs with custom-built DSP- and FPGA-based accelerators. These days they’re all PCs with an nVidia card and some BlackMagic I/O. No one makes custom hardware anymore.

      • Mark Coxon

        MJ, great perspective from someone who’s done it!

    • Leonard Suskin

      I’d expect a virtual machine running on a standard server rather than either an appliance or a desktop PC. This is a configuration to which IT administrators are accustomed.

      In fact, I’d think that said IT manager would rather have a server on their network than one of our special AV “mystery boxes”.

      • Mark Coxon

        Leonard, I think we largely agree.

        A dedicated machine running specialty software is key to reliable function, and these machines can’t be tasked or loaded with other applications that compromise performance. Whether we call it a server or a hardware codec doesn’t really matter, IMHO.

        I also agree that IT directors would 100% prefer that to a mystery black box!

  • Bryan

    As one of the minority of people shaking their little fists saying “hardware is not dead”, I feel that I need to chime in on the discussion. In the past year, the only real “switch” I’ve seen from a traditional hardware codec to a software one is migration to Lync via an appliance, which could really be considered a hardware codec (the RL 2, for instance). Then again, our product is geared toward the Fortune 500, and I don’t talk to many SMBs out there. They mostly want conference rooms (another thing the pundits call dead) equipped with either Cisco or Polycom codecs. These companies still want to project their executives in the best manner possible and to replicate a face-to-face meeting as best they can. Software does not do that (yet). It’s getting better, and I find myself on more and more software-based calls even though I have codecs from Polycom, Cisco, Lifesize and StarLeaf sitting around my office.

    • Mark Coxon

      Great perspective, Bryan. I’m glad to hear that you are able to maintain sales in a hardware-based VTC market. Based on the numbers, though, it seems your experience is not the norm at this point. Wider adoption at lower total revenues means less hardware is being sold.

      Thanks for jumping in though! It’s good to hear some still care about quality!