It was possibly the most mind-blowing tech demo in years: During the opening keynote of the Google I/O developers conference, CEO Sundar Pichai showed the company’s AI-driven Assistant making a phone call to a business and carrying out a verbal conversation with the person who answered.
What made the demo of the feature, called Duplex, so amazing was the Assistant's command of natural language: it dropped in "um," "mm-hmm," and "ah" at various points, and sounded so convincing that the person on the other end apparently had no idea he or she was talking to a machine. It was a very specific situation, but Google Assistant had effectively passed the Turing test.
The demo instantly got the tech world’s collective mind a-blazing. Was the Assistant obliged to say it wasn’t a person? How off-track would the conversation have to go for the Assistant to mess up? What happens when an automated system picks up the phone? And maybe most important, when will consumers get their hands on this feature?
To try to answer many of those questions, I called up Shane Mac, CEO of Assist, a voice and messaging assistant platform. Mac has been immersed in voice technology and AI for a long time, and he and I had a far-reaching discussion about Google Duplex on the MashTalk podcast, exploring Google Assistant’s impressive progress, the roadblocks ahead, and what happens when this technology scales.
Given the current climate, in which Silicon Valley companies are routinely criticized for prioritizing products and progress over customers, some saw the Duplex demo as another example of Silicon Valley hubris, effectively using service workers as guinea pigs.
“There are so many questions in this arena,” Mac said. “Bot ethics are finally real. If [this tech] is applied somewhere, and you don’t know it’s a real human, there are massive things that need to be built around legal policy, disclosing if it’s a robot or not... because you can see this quickly becoming a really big problem.”
When Google Assistant users get their hands on Duplex, the number of robocalls, in both directions, will certainly increase. Waiting on hold will effectively no longer be an issue for anyone with the tech, and sales forces may need to add another layer of human verification, a kind of verbal Captcha, to ensure they don't get flooded with calls from bots.
But what about the strange yet obvious consequence of tech like Duplex: a growing portion of phone calls becoming just bots talking to bots?
“Bots are going to talk to bots all the time,” Mac predicted. “Everyone’s trying to get to these fully automated experiences on the brand side. But what if the consumer bot gets here first? Then what happens?”
At the end of the day, though, this is one tech demo from just one of the many companies aggressively pushing forward into voice tech and digital assistants. This same week, Microsoft showed how its assistant, Cortana, would talk to Amazon Alexa.
It was a promising taste of platforms working together, but it also emphasized what the individual platforms are good at, with Alexa handling personal requests and Cortana covering work-related tasks like calendars and projects. To Mac, however, this kind of specialization might mean social networks have something new to offer, which would underpin the strategy behind Facebook's rumored smart speaker.
“This is where Facebook has an opportunity,” said Mac. “I would watch out for the social platforms getting into voice, because social experiences drive human nature, and it can do things with that social graph that other platforms can’t.”
You can subscribe to MashTalk on iTunes or Google Play, and we'd appreciate it if you could leave a review. Feel free to hit us with questions and comments by tweeting to @mashtalk or attaching the #MashTalk hashtag. We welcome all feedback.