When David Limp thinks about the future of Alexa, the AI assistant he oversees at Amazon, he imagines a world not unlike Star Trek—a future in which you could be anywhere, asking anything, and an ambient computer would be there to fulfill your every need.

“Imagine a world in the not-so-distant future where you could have infinite computing power and infinite storage,” Limp said today at WIRED’s 2017 Business Conference in New York. “If you take off the constraints of servers and building up infrastructure, what could you do?”

Limp, who has worked at Amazon since 2010 as the senior vice president for devices, sees Alexa as a critical part of this future. Already, you can say “Hey, Alexa,” and get the assistant to tell you the weather forecast, turn off the lights, hail an Uber, or do any of the thousands of other things that Amazon and developers have trained it to do. But Limp says there’s still plenty of work to be done before we live in the AI-assisted future he thinks about every day, and much of that effort has to do with training machines to better understand humans.

Since Alexa made its debut in 2014, the virtual assistant has taken up residence in dedicated devices like the Echo, Tap, Echo Dot, Echo Look, and Echo Show, as well as in dozens of other supported devices. All that interaction with humans has given Alexa plenty of voice data to parse through, data that has helped train the assistant to understand preferences, recognize different accents, and even figure out the intent of a request without specific keywords. A year ago, if you’d told Alexa to “order a car,” it wouldn’t have understood what you meant. (What, like, order one from the Amazon Marketplace?) Now, through improved machine learning, Alexa knows what you mean and will prompt you to enable an Uber or Lyft skill so that it can summon your ride.

Of course, Alexa is far from perfect. Limp says one near-term goal is improving Alexa’s understanding of anaphora, so that if you ask, “Who’s the president of the United States?” and then follow up with, “How old is he?” Alexa knows you’re still talking about Donald Trump. Amazon is also tinkering with Alexa’s short-term and long-term memory, so that the bot can recall context from yesterday’s conversation as well as the thing you asked it five seconds ago.

Those changes involve a shift toward making devices that aren’t personal but can work for everyone. Think more like a wall clock in the kitchen, which everyone in a household can glance at to get the time, rather than a smartphone, which is designed for one person to use.

“As we design the interfaces for Alexa, whether voice or graphical, it’s about making it ambient and so that anybody can use it,” Limp said on stage. “If you ask for a timer and I ask for a timer, they’re both going to work.”

In a world where devices will surround people all the time, those gadgets will have to understand what humans mean, however they choose to say it. For anyone who uses Alexa, that education is already under way: Every time someone talks to their Echo, the world inches a little bit closer to that Starship Enterprise future Limp imagines.
