“Hi, I’m calling to make a reservation,” the polite female voice on the other end of the line said. And that was how I found myself introduced to Google’s latest AI masterpiece, the conversation-filler-injecting Turing test contender known as Duplex.
When Google rolls it out to some users for testing in the coming weeks, Duplex will be a tool within the Google Assistant app that the testers can use to call stores on their behalf and find out their holiday hours. Later this summer, it will also be able to call restaurants and salons to set up reservations and hair appointments.
It was unveiled in May at the company’s annual developer conference to much fanfare and, in the days that followed, much fuss. After its impressive demo, people expressed concerns about how human it sounded (complete with “um” and “ah” sounds) and how, in its onstage example, it didn’t identify itself as an artificial-intelligence agent.
Since then, Google has tried to assuage fears. As the ethical criticisms multiplied in the days after the demo, the company quickly said that Duplex will announce itself as an AI agent to the human it’s calling and state that the call will be recorded.
And on Tuesday Google invited a bunch of reporters to the normally bustling Oren’s Hummus Shop in downtown Mountain View, California, to show off Duplex’s progress. While would-be customers were denied hummus outside, members of Google’s Assistant team showed reporters how the system will work and let us try it for ourselves inside the restaurant.
One big change from last month’s Duplex demonstration is that the calls I heard (and one I answered) clearly came from an AI agent. In the call I picked up while posing as a restaurant hostess, a female-sounding voice quickly said it was an automated service that was recording the call (this video from Google gives an idea of how it will look and sound; I was able to record audio for note-taking purposes, but Google asked me not to share it as a condition of attending the demonstration).
Nick Fox, vice president of product and design for Google Assistant, said Google knew back in May that it needed to disclose that the calls were coming from AI, and that this need was “validated” after the unveiling at the developer conference.
“I wouldn’t say we necessarily changed anything based on the feedback,” he said. He said the company gave the original demo without that notification because at the time it was trying to display the technology rather than a complete product.
During my call, I tried to give the voice on the other end a hard time—pretending I couldn’t hear details like the date on which the AI wanted a table—just to see what would happen. Would it loop in a human operator? (Google says this still happens in one out of five Duplex test calls, though it wouldn't say much else about these helpers.) Would it simply hang up if I confused it too much? (This didn’t happen to me, but it did to another reporter, who told the AI that the kitchen at the fictitious restaurant would be closed after a certain time and only the bar menu would be available.)
Overall, Duplex did what it was supposed to. It repeated itself when I asked it to. It didn’t get fazed when I asked it to hold, and it didn’t get tripped up when I stumbled over my own words. When I said I couldn’t take a reservation for five people on Sunday until 8 p.m., rather than the requested time of 6, it was still willing to make the reservation.
I was impressed by the humanness of its voice; it did sound like a person (though during one sample call I heard it sounded distinctly computerized when speaking a phone number).
Scott Huffman, vice president of engineering for Google Assistant, played our assembled group a recording of an early version of Duplex—a stilted, British-accented male voice that was unmistakably computerized—and said that this kind of non-human-sounding voice didn’t work with businesses. Lots of people would hang up the phone, and reservations weren’t completed, he said.
“People didn’t deal well with how unnatural it sounded,” he said. Apparently, things improved a lot as the AI was made to sound more like a person.
While “um” and “ah” sounds coming from an aural AI might freak you out, Huffman said those programmed verbal tics are meant to move conversations along smoothly. For instance, it might sound more polite to correct a human by saying “Uh, for five people” than “No, for five people” when making a dinner reservation.
Yet while Duplex now includes the vocal characteristics of humans, it had trouble understanding some questions that wouldn’t baffle you or me. When I asked if there would be kids in the group, and if a high chair would be needed, the female voice said it was making the reservation on behalf of a client and didn’t know (another reporter asked a similar question and got the same kind of answer).
And when I asked if there would be any accessibility needs—such as if anyone in the group used a wheelchair—it couldn’t give me an answer either.
“I don’t really know, sorry,” the voice said.
When Duplex is deployed, there will still be limitations, and no chance for a real conversation. If you try to ask something that’s on a topic other than holiday store hours, restaurant reservations, or hair appointments, Huffman said, Duplex will try to steer the conversation back to the task at hand. “It’s like a very weird human that can only do these three things,” he said.
While the demo was mostly positive, there are still several hurdles to overcome before your local taqueria gets inundated with AI reservation requests. These include practical concerns, like privacy laws and phone answerers who prefer human interaction (businesses that simply don’t want to accept calls from Google’s AI will be able to opt out). And Google can’t just roll out Duplex anywhere it wants. The company will need a permit before it can operate in Texas, for example.
Also unknown is how well Duplex understood our conversation. While I was able to get the AI on the other end of the line to audibly confirm the time and date of the reservation in our chat, Google did not demonstrate what Duplex understood from our conversations after we hung up. I don’t know for sure that its human “client” got the 8 p.m. reservation for five added to her calendar, for instance.
And then there are the longevity concerns. Google Duplex seems like a novelty now, so people may stay on the line when it calls, but at what point will it start to be tuned out and ignored like those survey robocalls?
Huffman, at least, thinks it may not be that big a deal to busy restaurant workers who simply want to take down a reservation. He said test call recipients haven’t had much of a reaction to hearing that an AI system is calling to book a table.
“People just want to get through the task,” he said.