
On Digital Assistants

An AV consultant once had a conversation with an executive from a manufacturer about the most expensive home he’d seen. He asked what kind of control system one would build for a ten-million-dollar home. The executive answered that there’d be a single button. When the button was pressed, a servant would walk into the room and ask, “How may I help you?”

That’s not my story, but one I was told quite a few years ago now. There’s a fair bit to say about what we’re selling, what control and automation are, and whether or not technology will catch up to the simple human element of an assistant who knows what you want and how you want it. What brings this anecdote to my mind today are two events: Crestron’s announcement of integration with Amazon Echo and my own acquisition of a Google Home smart speaker.

First, the obvious for those who’ve not tried it: Voice control is becoming far, far better than it once was. Not only do these devices recognize a voice in normal conversational tones, but Google even has a feature to distinguish between different people’s voices. If I say, “OK, Google, what’s on my calendar today?”, I’ll get MY calendar, while my wife or daughter will get their own Google calendars. If you’ve followed me for any length of time, you’ll know that devices as gateways to ecosystems is one of my ongoing themes (and if you’ve NOT followed me for some length of time, why not?). The growing crop of smart speaker devices is no exception. Google Home gives you Google search, VoIP calls via your Google contacts, the aforementioned Google calendar, access to your Google music playlist, and integration with Google’s Chromecast streaming device. Amazon Echo gives you access to your Amazon content and integration with your Amazon Fire streaming devices.

So digital assistants are gateways to ecosystems. Is that all that they are? Can they become that assistant for the rest of us, one call away and ready to do whatever it is we ask? It’s not a good idea to say “never” in terms of technology, but this time we must say, “not yet” – and, perhaps, “yet” is far enough away that it may as well mean “never.” I also wonder if that’s where we’d want to go.

I’ve lived with one of these for just over a week now and find myself speaking to it several times a day — mainly setting reminders for things to do later, setting timers (winter is baking season!) and asking Google questions. The latter, while the simplest, is the part that most feels like science fiction. Five decades ago we imagined James T. Kirk, captain of the Enterprise, in possession of a ship’s computer. When he needed something from it he didn’t fiddle with a touchscreen or keyboard or switches and dials; he spoke to it, asking it questions in plain English to which the computer would speak an answer. It isn’t quite conversation, but it is getting close.

A digital assistant still isn’t the human assistant hired by members of the one percent to fulfill their every need. To give an analogy from the world of science fiction, the Star Trek computer, like the digital assistants of today, has for decades been depicted as being wonderful at doing what it is told. It executes commands. It answers questions. Given some third-party hardware, “what it is told” can easily expand from “tell me when Ray Bradbury died” to “turn on the kitchen lights” or “set the temperature to 67 degrees” or “place a video call to the London office.”

That part is special and amazing, and we’ve accomplished it. The part that’s missing is the one we see in the next Star Trek series, decades later, when an artificial man named Data serves on the crew of the new Enterprise. Like the ship’s computer, Data speaks in plain English. Unlike the computer, Data takes initiative. In some ways this is a step back from today’s digital assistants; he’ll not always unquestioningly obey an instruction. On the other hand, he IS capable of showing initiative, of improvising, and of doing those things too complex for even the most carefully written algorithms.

That, of course, is the first part of what we’re missing: initiative. It’s easy to tell your digital assistant to turn on the lights, and just as easy to schedule the lights to be on and the coffee made at 5 a.m. each morning (I wake up early). It isn’t even that hard to program an exception – coffee at 5 a.m. except on New Year’s Day, or unless you’re traveling, or if you have the stomach flu. The problem is that today creating these exceptions requires the user to think ahead of all of the possibilities and manually create rules, exceptions to the rules, and exceptions to the exceptions. Invariably, things will be missed that no human assistant would get wrong. That’s the first thing we’re missing.
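To make that concrete, here is a minimal sketch of the kind of hand-written exception logic I mean. The helper names (lights_on, brew_coffee, is_traveling, is_sick) are hypothetical stand-ins, not any real home-automation API; the point is only that every exception has to be imagined in advance and coded by hand.

from datetime import date, datetime

# Hypothetical stand-ins for a home-automation API; none of these are real library calls.
def lights_on():
    print("lights on")

def brew_coffee():
    print("brewing coffee")

def is_traveling(today: date) -> bool:
    return False  # stub: a real version might check a travel calendar

def is_sick() -> bool:
    return False  # stub: someone has to remember to flip this flag by hand

def five_am_routine(now: datetime):
    """Run the 5 a.m. routine, minus every exception the owner remembered to write down."""
    today = now.date()

    # Exception 1: no coffee on New Year's Day.
    if today.month == 1 and today.day == 1:
        return
    # Exception 2: skip everything while traveling.
    if is_traveling(today):
        return
    # Exception 3: stomach flu.
    if is_sick():
        return
    # ...and nothing here covers the exceptions never written down:
    # houseguests, an overnight power outage, an empty coffee maker.

    lights_on()
    brew_coffee()

five_am_routine(datetime.now())

Every new branch has to be anticipated by the user, which is exactly the gap a human assistant closes without being asked.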

The second is darker — the sense of power over another human being. Having a servant whose livelihood depends on your happiness cements the fact that you have a higher social status than they do. That your happiness is more important than theirs. It’s a way to separate those in power from those who need it. This is something no device can replicate.

Or can it?

Look at the world of Star Wars. Droids in some ways fill the role of fancy digital assistants. They’ll do your work for you, they’ll translate languages for you, they’ll even fight for you. What’s more, they’ll be owned by you. You can buy them, sell them, even erase their memories. We have a word for the buying, selling and ownership of thinking beings. It’s a word Star Trek used when it was suggested that the aforementioned Data was property and, thus, could be disassembled and experimented on.

Slavery.

We started with a technical question and a user experience question, but we’ll end with a philosophical one: If we created the control system we really wanted, would we in essence be creating people? And if we wanted them to serve at our pleasure, would we be creating a race of slaves and handing them the keys to the very workings of our entire lives?

This isn’t something to which we are close, but perhaps we should still ask ourselves – is this what we want?

I fear that the answer might be “yes.”
