David Burke says the next wave of Android smartphones will tackle one of the agonies of modern existence: Copying text from one app and pasting it into another.

Burke is the Google vice president of engineering who oversees Android. This week, at the annual Google I/O conference in Silicon Valley, he unveiled Android O, the latest incarnation of the world’s most popular mobile operating system, and part of his pitch was that the new OS could take the pain out of cut and paste. Android O will automatically recognize and highlight names, places, and addresses in email messages and other text, he said, so you can copy and paste them without struggling to drag those horribly small and ridiculously uncooperative little arrow thingies from word to word. As seen on the big screen behind him, Burke double-tapped the words “Old Coffee House” in an email, and Android highlighted all three of those words—no more and no less.

That feat drew a very real round of applause from the hundreds of coders and other tech heads gathered at the Shoreline Amphitheater. But it was merely the teaser for so many other things to come.

The new copy-and-paste tool is driven by a deep neural network that runs right there on the phone—an AI service trained to recognize things like names and addresses. And that’s not an easy thing to pull off. Typically, these kinds of AI algorithms, including those that identify objects in photos or recognize commands spoken into smartphone digital assistants, must run in some massive data center on the other side of the internet—meaning you can’t use them unless you have a wireless signal. But now, Google is building a software engine for smartphones suited to running AI without help from some distant data center.
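Google hasn’t published the details of that on-device model, but the basic mechanics of smart text selection can be sketched with a toy: generate every candidate span of words around the user’s tap, score each one, and select the winner. In the sketch below, a hand-written capitalization heuristic stands in for the trained neural scorer; `smart_select`, `score_span`, and the scoring rule are all hypothetical illustrations, not Google’s actual code.

```python
# Toy illustration of on-device "smart text selection" (hypothetical, not
# Google's model): score candidate spans around a tapped word and return
# the highest-scoring one, the way a trained classifier might.

def candidate_spans(words, tap_index, max_len=3):
    """All spans of up to max_len words that contain the tapped word."""
    spans = []
    for length in range(1, max_len + 1):
        for start in range(tap_index - length + 1, tap_index + 1):
            if 0 <= start and start + length <= len(words):
                spans.append((start, start + length))
    return spans

def score_span(words, span):
    """Stand-in for a neural scorer: favor spans of all-capitalized words."""
    tokens = words[span[0]:span[1]]
    capitalized = sum(1 for t in tokens if t[:1].isupper())
    # Reward spans made entirely of capitalized words, like "Old Coffee House".
    return capitalized if capitalized == len(tokens) else -1

def smart_select(text, tap_index):
    words = text.split()
    best = max(candidate_spans(words, tap_index),
               key=lambda s: score_span(words, s))
    return " ".join(words[best[0]:best[1]])

# Tap on "Coffee" (word index 4) and the selection grows to the full name.
print(smart_select("Meet me at Old Coffee House tomorrow", 4))
# → "Old Coffee House"
```

In the real system a neural network, not a capitalization rule, supplies the scores; the point of the sketch is only that the model ranks spans rather than single words, which is why a double-tap can land on exactly “Old Coffee House,” no more and no less.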

It’s called TensorFlow Lite, a streamlined version of the open source software engine that drives neural networks inside Google’s data centers. According to Burke, it’s designed to be “fast and small while still enabling state-of-the-art techniques”—namely, the new wave of neural networks that are so rapidly changing the way companies build and operate online services. Google won’t say much more about the project. But it has revealed that TensorFlow Lite will become part of the primary TensorFlow open source project later this year, meaning it will soon be available to the worldwide community of coders. The idea is that so many others, outside Google, will build neural networks that run right there on the phone, driving tasks well beyond image recognition, speech recognition, and cut and paste.
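Google hasn’t explained how TensorFlow Lite manages to be “fast and small,” but one standard trick for shrinking neural networks for mobile hardware is 8-bit quantization: each 32-bit floating-point weight is stored as a single byte plus a shared scale factor, cutting model size by roughly 4x at the cost of a little precision. The sketch below illustrates that general technique only; it is not TensorFlow Lite’s implementation.

```python
# Sketch of 8-bit weight quantization, one of the standard techniques a
# mobile inference engine can use to shrink models. (Illustrative only;
# not TensorFlow Lite's actual code.)

def quantize(weights):
    """Map float weights to int8 values in [-127, 127] plus a scale."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(qweights, scale):
    """Recover approximate floats from the int8 representation."""
    return [q * scale for q in qweights]

weights = [0.91, -0.42, 0.07, -1.30]
q, scale = quantize(weights)
approx = dequantize(q, scale)

# Each 32-bit float now fits in one byte, and the per-weight error is
# bounded by half the scale factor.
max_err = max(abs(a - w) for a, w in zip(approx, weights))
print(q, max_err)
```

Smaller weights mean smaller downloads and less memory traffic, which matters far more on a phone than in a data center full of servers.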

Google isn’t the only one working to push neural networks beyond the data center. Facebook is already using neural networks to add Snapchat-like filters to photos without calling out over the internet, and the company has released an open source software engine that’s similar to TensorFlow Lite. The difference is that Google controls an enormously popular smartphone operating system. It can distribute this kind of mobile tech much more quickly—and to a far larger audience.

Phones That Learn

Even with these software engines, running neural networks on a phone is a questionable undertaking. They can sap processing power and battery life. But both Facebook and Google are encouraging chip makers to build processors specifically suited to this kind of arithmetic. In unveiling TensorFlow Lite, Burke said that he and his team are also building hooks into Android’s code that can tie into such chips.

Companies such as Intel are already working on this kind of mobile AI processor. And Burke says Google will encourage chip makers to build mobile chips that can not just run neural networks but also train them. Before it can identify a dog in a photo or the name of a coffee shop in an email, a neural net must first learn the task by analyzing vast amounts of data. To learn what a dog looks like, for instance, it must analyze millions of dog photos. Typically, this training also happens on machines tucked away in data centers. But Google believes AI systems running purely on phones will eventually start to learn for themselves as well.
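The distinction between training and inference is easiest to see in miniature. The toy below (nothing like a production vision model) fits a single neuron to a labeled rule by nudging its weights on every mistake, then uses the learned weights to classify new inputs. Training is the loop; inference is one cheap arithmetic pass, which is why inference reached phones first.

```python
# Minimal illustration of training versus inference: learn a threshold
# rule from labeled examples, then predict with the learned weights.

def predict(w, b, x):
    """Inference: a single multiply-add and a threshold."""
    return 1 if w * x + b > 0 else 0

def train(examples, steps=1000, lr=0.1):
    """Perceptron-style training: nudge the weights on every mistake."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, label in examples:
            error = label - predict(w, b, x)
            w += lr * error * x
            b += lr * error
    return w, b

# "Vast amounts of data" in miniature: inputs above 5 are labeled 1.
examples = [(1, 0), (2, 0), (3, 0), (7, 1), (8, 1), (9, 1)]
w, b = train(examples)
print([predict(w, b, x) for x in [2, 4, 6, 8]])
```

The training loop repeatedly revisits the data and adjusts the weights, which is exactly the compute-hungry, battery-draining part that today still lives in data centers.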

That effort will take even more doing. But Google has clearly committed to this vision of AI on the phone. At I/O, the company also unveiled a custom-built chip for both training and running neural networks in its data centers. I asked Google CEO Sundar Pichai if the company might build its own mobile chip along the same lines. He said the company had no plans to do so, but he didn’t rule it out either. “If we don’t find the state-of-the-art available on the outside,” he said, “we try to push it there.”




[WIRED]