More efficient machine learning could upend the AI paradigm

Smaller algorithms that don’t need mountains of data to train are coming.
February 2, 2018
Yaopai

In January, Google launched a new service called Cloud AutoML, which can automate some tricky aspects of designing machine-learning software. While working on this project, the company’s researchers sometimes needed to run as many as 800 graphics chips in unison to train their powerful algorithms.

Unlike humans, who can recognize coffee cups from seeing one or two examples, AI networks based on simulated neurons need to see tens of thousands of examples in order to identify an object. Imagine trying to learn to recognize every item in your environment that way, and you begin to understand why AI software requires so much computing power.

If researchers could design neural networks that could be trained to do certain tasks using only a handful of examples, it would “upend the whole paradigm,” Charles Bergan, vice president of engineering at Qualcomm, told the crowd at MIT Technology Review’s EmTech China conference earlier this week.

If neural networks were to become capable of “one-shot learning,” Bergan said, the cumbersome process of feeding reams of data into algorithms to train them would be rendered obsolete. This could have serious consequences for the hardware industry, as both existing tech giants and startups are currently focused on developing more powerful processors designed to run today’s data-intensive AI algorithms.
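Bergan did not describe a specific technique, but one common family of one-shot methods classifies a new input by comparing it against a single labeled example of each class in a learned embedding space, as in matching or prototypical networks. Here is a minimal sketch of that idea; the toy 4-dimensional embeddings stand in for the output of a trained encoder, which is assumed rather than shown:

```python
import numpy as np

def one_shot_classify(example_embeddings: np.ndarray,
                      labels: list,
                      query_embedding: np.ndarray) -> str:
    """Classify a query by its nearest labeled example in embedding space.

    example_embeddings: (n_classes, d) array, one embedded example per class.
    query_embedding: (d,) array, the embedding of the new input.
    """
    # Distance from the query to each class's single support example.
    dists = np.linalg.norm(example_embeddings - query_embedding, axis=1)
    return labels[int(np.argmin(dists))]

# Toy usage: one example each of two classes, then a new query.
support = np.array([[1.0, 0.0, 0.0, 0.0],   # embedded "coffee cup" example
                    [0.0, 1.0, 0.0, 0.0]])  # embedded "teapot" example
query = np.array([0.9, 0.1, 0.0, 0.0])
print(one_shot_classify(support, ["coffee cup", "teapot"], query))
# -> "coffee cup"
```

The heavy lifting in such systems happens in training the encoder that produces the embeddings; once that is done, recognizing a new category requires only one labeled example, not thousands.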

It would also mean vastly more efficient machine learning. While neural networks that can be trained using small data sets are not a reality yet, research is already being done on making algorithms smaller without losing accuracy, Bill Dally, chief scientist at Nvidia, said at the conference.

Nvidia researchers use a process called network pruning to make a neural network smaller and more efficient to run by removing the neurons that do not contribute directly to its output. “There are ways of training that can reduce the complexity of training by huge amounts,” Dally said.
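Dally did not detail Nvidia’s method, but a simple version of the idea is magnitude-based pruning: score each neuron by the size of its weights and discard the weakest ones. The NumPy sketch below is illustrative, not Nvidia’s implementation; the L2-norm scoring and the keep_fraction parameter are assumptions for the example:

```python
import numpy as np

def prune_neurons(weights: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Drop the rows (neurons) with the smallest L2 norm from a layer's weights.

    weights: (n_neurons, n_inputs) matrix of one fully connected layer.
    Returns a smaller matrix containing only the retained neurons.
    """
    norms = np.linalg.norm(weights, axis=1)       # rough proxy for each neuron's contribution
    n_keep = max(1, int(round(keep_fraction * len(norms))))
    keep = np.argsort(norms)[-n_keep:]            # indices of the strongest neurons
    return weights[np.sort(keep)]                 # preserve the original ordering

# Example: keep the strongest 25% of 512 neurons in a layer.
w = np.random.randn(512, 256)
w_small = prune_neurons(w, keep_fraction=0.25)
print(w_small.shape)  # (128, 256)
```

In practice, pruning is typically interleaved with retraining so the surviving neurons can compensate for the removed ones, which is how accuracy is preserved while the network shrinks.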
