Have you ever been abducted by an alien? No? Well, me neither, but I think we’ve all heard the tales. One night, out in the woods, someone sees a light shining from above. They look up, are blinded by the light, instantly levitate off the ground, and are pulled into the ship, where little green “men” run various tests and observe their subject before releasing them back to their own species just in time to make the 11 o’clock news.
The stories are entertaining at the very least, and with over 70 sextillion stars in the visible universe, many of them orbited by multiple planets, many argue that there simply has to be extraterrestrial life. I can’t say conclusively whether there is or isn’t life out there, but I can guarantee you that these abduction tales are, in all likelihood, false.
How can I be so sure? The description. Little man, two eyes, a nose, a mouth, two legs, two arms, walks upright. It defies the very definition of the term “alien.” Odds are that extraterrestrial life would be so different from that of life on earth that we may not even be able to perceive it.
According to Lord Martin Rees, president of the Royal Society and astronomer to the Queen,
“They could be staring us in the face and we just don’t recognise them. The problem is that we’re looking for something very much like us, assuming that they at least have something like the same mathematics and technology.”
You see, the discovery of new life most likely won’t look anything like what we know and understand life to be currently. We have a very specific paradigm we see life through, and that limits our ability to see anything else. Just because the most intelligent life form on this planet has a symmetrical body, two eyes, and walks upright on two legs doesn’t mean that those traits are essential or common to other intelligent life forms, which would inhabit environments much different from our own.
So how does this all relate to technology? In exactly the same way.
Now, there are a ton of people out there who purport that Moore’s Law, the observation that transistor count, and with it computing power, doubles roughly every two years, dictates that we will have rapid innovation in the AV space as well. I disagree, based on the lack of evidence. In fact, I will go back to the alien life example above.
If the probability of extraterrestrial life is so high based on all of the parameters I set forth above, where is it? Why do we have no concrete proof? This dilemma even has a name: the Fermi Paradox. You see, mathematical probability, no matter how high, doesn’t always beget reality.
We have this same Fermi Paradox in AV. If Moore’s Law says we double transistor count every two years and with it computing power, where is all the innovation?
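To put the premise in concrete terms, here is a minimal sketch of the doubling Moore’s Law describes; the starting count and time span are illustrative numbers, not figures from any particular chip:

```python
# Moore's Law as stated above: transistor count doubles every two years.
def moore_projection(initial_count: float, years: float,
                     doubling_period: float = 2.0) -> float:
    """Project a transistor count forward, doubling every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Ten years is five doublings: a 32x increase. Yet nothing in this
# arithmetic says anything about what that capacity gets used for.
print(moore_projection(1_000_000, 10))  # → 32000000.0
```

The math makes the argument’s point for it: the growth is automatic, but the output is only a bigger number, not a new idea.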
Here’s my take. I can put one engineer with little creativity in a room and he will produce nothing new or innovative. I can then add three more engineers with little creativity into that same room, effectively quadrupling the “computing power” and still get nothing new. Just because the computing power quadruples, innovation isn’t spawned automatically. There needs to be a creative spark, typically spurred by asking a question in a new and unique way.
This is why AI and machine learning are so overstated. Sure, big data can show us trends that help us optimize the systems we already have in place; I don’t deny that. However, big data isn’t going to solve our creativity problem.
In fact, if you’ve read Gladwell’s Blink, he makes a compelling case, example by example, that thin-slicing big data can cause paralysis and overconfidence, leading to undesirable outcomes. There is an intrinsic benefit the human gut brings to our decision making, and one creative, motivated individual can beat big data and its analysis even against overwhelming odds.
The next big thing will come from human intuition and connection, and the unique ability of a person to approach a problem from a different, never before anticipated angle.
The next big thing isn’t going to look like the last big thing, but unfortunately our own ideas about what technology looks like today inhibit our ability to produce real innovation at a rapid pace.
I once heard video conferencing evangelist Simon Dudley say that dramatic change of this nature usually “hits you in the back of the head.” Given that, I can’t say with certainty what the next big thing is, but I can say what it isn’t.
In video it isn’t Ultra High Definition. It isn’t High Dynamic Range. It isn’t enhanced color space, or even deeper bit depth, or full chroma sampling, or, dare I say, as cool and beautiful as it is, transparent OLED. These are all worthy endeavors that will make amazing improvements to today’s displays, but they won’t change the face of AV as we know it, just as 1080p, 3D, autostereoscopic displays, and transparent LCD failed to do in the past.
If you think back, the last real transformative change to displays was going from the CRT to the flat panel, and even the flat panel didn’t gain steam until it offered a size larger than the CRTs. In 1995 Fujitsu’s 42″ plasma finally offered a larger, thinner alternative to the Sony 40″ HD XBR CRT, making plasma a desirable option, despite the dramatic jump in cost at that time. Plasma displays as a concept had been around since 1936, and became a monochromatic reality in 1964, so it was no overnight game changer to say the least. It took 60 years to manifest.
Think of Apple as another example. Apple is thought of as the king of the innovation crop, but consider the fact that the iPhone launched in 2007. Sure, we’re on version six now, but each iteration is about adding a new camera, more storage, and so on. Someone magically transported from 2007 to today could pick up an iPhone, know it’s an iPhone, and navigate its use easily and fairly comprehensively in a matter of seconds. The iPhone hasn’t substantially changed in nine years, and the iPad, introduced in 2010, hasn’t changed substantially in the last six either.
Yet somehow, we think that adding an app to control our latest black box is “game-changing” cutting-edge stuff. It was when Savant did it in 2009, introducing a remote that was literally an iPod on top and a remote on bottom. It was also big news when they ditched their proprietary touch panels altogether in 2010/11, leveraging the newly released iPad for control instead. But today? It’s a given that you should have an app, not an innovation.
I wrote a few months back that the “app” is already dead, and I stand by that. In fact, looking at tech trends for 2016, everyone from Fast Company to Gartner to Microsoft agrees that we are shifting from a GUI to Zero UI or Ambient User Experience. So as you’re developing the new app for release next year thinking it will be the next big thing, it won’t be. (Sorry.)
The next big thing in video will have to break through the rectangle and shatter the fourth wall. The next big thing in applications will have to rely more on wearables, embeddables, and sensors as opposed to touch-screen GUIs. The next big thing in unified communications will have to allow for physical interaction with real or virtual objects and be so lifelike that it is not perceptibly different from a face-to-face meeting.
Oh sure, we’ll hear the small improvements we continue to see promoted as “the next big thing,” but they most likely won’t be. Instead, just as Xerox overlooked the PC, IBM overlooked Windows, and Kodak overlooked the digital camera, the next big thing will appear out of seemingly nowhere and be so different from the current status quo that most people won’t see it for what it is until it becomes the one thing everyone must have.
So, I will again reference one of my favorite blog posts of all time as a call to action for those in product development in the AV space:
“Where are you strange thinkers? Where are you weirdos? For god’s sake, get weird. Do different… PLEASE… the fate of our ecosystem rests in your hands…in your mind lives the step function we desperately need… inspire us!”