It’s About to Get Way, Way Easier to Put AI Everywhere
Google has a vision for a world full of cheap and tiny smart devices—and it hopes its software will power them all.
A couple of years back, Google launched an open-source machine-learning software library called TensorFlow. It has since exploded in popularity, to the point where it’s now used by the likes of Airbnb, eBay, Uber, Snapchat, and Dropbox to power their AI development. Its appeal is obvious: it allows relative beginners to build and train neural networks without needing a PhD in artificial intelligence. As a result, the library now forms a major component of Google’s business plan. The company has also produced a slimmed-down version called TensorFlow Mobile, designed to shrink AI software so that it can run efficiently on phones.
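To give a sense of what that looks like in practice, here is a minimal, hedged sketch of building and training a small neural network with TensorFlow's Keras API. The data, layer sizes, and training settings are invented purely for illustration, and the code uses today's API rather than anything specific to Google's announcement.

    # Illustrative sketch: build and train a tiny neural network with TensorFlow's
    # Keras API. The data here is random and exists only to show the workflow.
    import numpy as np
    import tensorflow as tf

    x = np.random.rand(1000, 20).astype("float32")   # 1,000 fake examples, 20 features each
    y = np.random.randint(0, 2, size=(1000,))        # two made-up classes

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, epochs=3, batch_size=32)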
But that’s not far enough for Google. Now, it’s launched an even leaner version of the library called TensorFlow Lite, which is intended to help developers build ... well, lightweight AI software, for use not just in smartphones but also in embedded devices—the simple computers you might find in things like printers, fridges, thermostats, speakers, and other household gadgets.
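As a rough sketch of the workflow this implies, the snippet below converts a tiny trained model into the compact TensorFlow Lite format and then runs it with the library's lightweight interpreter, the way a phone or embedded device would. It uses the current tf.lite converter and interpreter APIs, and the model and input data are placeholders for illustration, not anything Google has published.

    # Rough sketch, assuming today's tf.lite API: convert a tiny Keras model to the
    # TensorFlow Lite format, then run it with the lightweight interpreter as an
    # embedded device or phone would. Model and input are illustrative placeholders.
    import numpy as np
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(10, activation="softmax", input_shape=(4,)),
    ])

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()               # a small flatbuffer of bytes
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)

    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    interpreter.set_tensor(inp["index"], np.random.rand(1, 4).astype("float32"))
    interpreter.invoke()
    print(interpreter.get_tensor(out["index"]))      # the model's prediction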
If it works, this could be a huge inflection point for everyday AI.
Currently, most small or portable devices lack the horsepower to run machine-learning algorithms on their own. Indeed, only now are smartphones, like Apple’s iPhone X, getting dedicated hardware to help them run such software efficiently. Simpler devices still mostly have to send their data over an Internet connection to the cloud, where a big server crunches the numbers and sends back the result.
That’s largely why Amazon’s Alexa, for instance, has a small but very noticeable lag when you ask it to do something. Putting a fully functioning AI onto a device so it doesn’t need an Internet connection could make things faster, and it would be a boon for those concerned about keeping their data locked down, too. It also raises the possibility of putting simple AIs on very basic chips, which could make smart devices practically disposable.
More broadly, today’s announcement is another step in the rapid escalation of competition in the world of AI software. Microsoft and Amazon, for example, recently teamed up to build their own TensorFlow competitor called Gluon.
If the partnership sounds strange, that’s because the goal isn’t necessarily to best Google in AI software—it’s to keep the search giant lagging a distant third in the cloud business, which is a big earner for both Amazon and Microsoft. The rise of TensorFlow has given Google an edge in attracting users to its cloud, since it’s often easiest to run AI built with a company’s software library on that same company’s cloud servers. By teaming up, Microsoft and Amazon are hoping to keep as far ahead of Google as possible. The arrival of TensorFlow Lite, though, only serves to underscore that Google has no intention of being pushed around.