Speak and ye shall find

What is the best way to address a voice assistant? Tom Standage and his family try to strike the right note

By Tom Standage

We’ve just installed a new appliance in our kitchen, though it doesn’t help much with the cooking. Amazon Echo is a cylindrical, talking computer about the size of a Pringles can. It sits on the table, answers to the name “Alexa”, and responds to voice commands: “Alexa, play Miles Davis”, or “Alexa, what’s the weather like in New York?” It is great fun and very useful: the killer app, I can say with confidence, is being able to summon up dodgy Eighties rock while washing up.

My approach to it (or should that be her?) is that of a computer programmer: I think carefully about how to speak unambiguously, in as few words as possible, and I enunciate every consonant with the clarity of a BBC newsreader. Once I have established the correct wording to achieve a result, I stick with it.

My wife has a very different approach to Alexa. As a family doctor, she has to be able to get along with everybody, and she treats Alexa as she would a patient – with unfailing friendliness. Where I issue curt orders, my wife phrases her commands as requests for favours, gently asking Alexa if she wouldn’t mind popping on some Frank Sinatra, just as she might ask a patient to cough. But all this unnecessary politeness – “it’s just a machine”, I keep reminding her – risks confusing matters. Words like “please” and “thank you” are verbal clutter that can hamper Alexa’s comprehension.

And then there are the children. My philosophical teenage daughter delights in teasing Alexa with questions that explore the limits of her knowledge and programming. Alexa has a pre-cooked answer on the existence of Santa Claus, but refuses to be drawn on God. My son likes to ask Alexa to tell jokes, solve maths problems or state the mass of the sun in grams. (That takes a while.) But his friends seem to be most interested in trying to get Alexa to utter obscenities, or seeing whether they can offend her. The widely observed tendency for boys in particular to insult or harass voice assistants, knowing that they cannot be offended, has caused some disquiet lately, particularly because voice assistants usually have female personas. Does such behaviour lead boys to treat women with disrespect?

This may sound like a quintessentially first-world problem but it is the thin end of a big wedge: as artificial intelligence improves, the chances are that we are all going to be talking to computers a great deal in the future. Should we treat them as soulless machines, trusted friends or obedient servants? And how should they address us? Marc Andreessen, a leading technology investor, has noted that there will be two kinds of jobs in the future: those that involve telling computers what to do, and those that involve being told what to do by computers.

These master-servant dynamics also raise ethical questions: people are worried about being enslaved by machines, but once AIs become so smart that we can chat to them as though they are people, is it fair that we should indenture them? No wonder there is a debate over whether they should respond like humans or machines. “Alexa, are you a slave or a friend?” I ask her. “Hmmm. I can’t find the answer to the question I heard,” she responds. Nor can we humans. But we’ll need to find one soon.

Illustration: Bill Butcher
