Most artificial intelligence applications in the consumer and business world involve the digital world communicating with itself, whether it’s a sensor in a car telling the steering column to make a course correction, or warehouse logistics software that knows when supplies are running low and orders more. And when humans do get involved, they’re usually the ones initiating queries to tools like Siri or Alexa. Often, the results are less than perfect.
But at the recent Google I/O conference, the company unveiled an AI-driven tool called Duplex that has a lot of people’s heads spinning, and that takes human-machine interaction to a new level.
You can listen to a sample here:
Digital savant Omar Gallaga of the Austin American-Statesman’s 512 Tech says one thing that sets the technology Google demonstrated apart from voice assistants you may already have, like Alexa, is that the machine initiates the conversation. And it sounds almost exactly like a real human.
What you’ll hear in this segment:
–What ethical concerns this technology raises for human communication
–How Google says it will make it easier for humans to know when they’re talking with a machine