
Hey Google, stop trying to make Assistant my friend

Machines are not people, too.

Jessica Dolcourt Senior Director, Commerce & Content Operations
Jessica Dolcourt is a passionate content strategist and veteran leader of CNET coverage. As Senior Director of Commerce & Content Operations, she leads a number of teams, including Commerce, How-To and Performance Optimization. Her CNET career began in 2006, testing desktop and mobile software for Download.com and CNET, including the first iPhone and Android apps and operating systems. She continued to review, report on and write a wide range of commentary and analysis on all things phones, with an emphasis on iPhone and Samsung. Jessica was one of the first people in the world to test, review and report on foldable phones and 5G wireless speeds. Jessica began leading CNET's How-To section for tips and FAQs in 2019, guiding coverage of topics ranging from personal finance to phones and home. She holds an MA with Distinction from the University of Warwick (UK).
Expertise Content strategy, team leadership, audience engagement, iPhone, Samsung, Android, iOS, tips and FAQs.

Google wants to trick you into thinking machines are human.

That's a theme the tech titan pushed as it rolled out new features for its Google Assistant voice software at Google I/O, Google's annual conference for developers. Google Assistant is in nearly every Android phone and the Google Home smart speaker, and it drives future smart displays. That means robo-voices that sound and act more human than any you've heard before will live all around you. And that's exactly what Google wants.

You'll also be able to make your kids say "please" before the Google Home responds. Soon you'll be able to select singer John Legend as your new Google Assistant voice. And the oddly named Google Duplex makes digital voices as lifelike as they've ever been.

AI is inescapable. It's the future. But here's my issue: Google Assistant is simply trying too hard. It's too chummy. Say "Good night" to your Google Home and it triggers your bedtime routine (like turning off the smart lights and adjusting the thermostat), then responds with "Let's get ready for bed" and "Sleep well."

Open Google Assistant on your phone and it cheerily greets you: "Hi, I'm your Google Assistant. I'm here to help you." Ask it who it is and it tells you, "I'm a friend."


Yes, but no.

Screenshot by Jessica Dolcourt/CNET

That's the thing. I don't need to be friends with my technology. I don't need it to greet me by name or pretend to know me, and I don't want to wait around for it to respond. I just want it to turn off the lights or open a website… silently.

For me, Google Assistant is solely a tool, not a social experience. And that's a key point Google seems to miss as it falls over itself to own the AI experience we'll most certainly grow to depend on.

Just look at how the internet reacted to Google's eerily lifelike Duplex software, which sounds human enough to fool real humans as it calls real people to make appointments on your behalf.

Google has since clarified that Duplex's robot voices will identify themselves while making calls, which addresses the initial ethical concern. But the early uproar makes one thing unequivocally clear: plenty of people just don't want to treat machines like flesh and blood.

At last week's I/O conference, Google channeled its desperate thirst to create, and then fill, a need for human-like AI straight to the developers who will make many of the apps that use the (admittedly impressive) Google Assistant technology.

"Imagine what it's like to have conversations with technology," Saba Zaidi, a Google interaction designer, told developers during a session at the show.  

Zaidi went on to urge developers to model Google Assistant conversations on real human interaction. "Try to observe the relevant conversation in the real world," she said, noting that "Not everything that does well in your app does well in human conversation."

And if that's a challenge for developers, Google even went so far as to create a tutorial of best practices for structuring conversations, called, I kid you not, Conversation 101.

Of course, designing a conversation for a computer program to simulate a real human exchange is different from engaging in one yourself. The fact that Google feels it needs to pass out pointers perhaps reveals the crux of the problem.

Maybe instead of trying to make machines have meaningful conversations with human beings, Google Assistant should butt out and let real people focus on having meaningful conversations -- with each other.

