
The computing power needed to train AI is now rising seven times faster than ever before

An updated analysis from OpenAI shows how dramatically the need for computational resources has increased to reach each new AI breakthrough.
November 11, 2019
Go player Ke Jie plays a match against Google's artificial intelligence program, AlphaGo. AP

In 2018, OpenAI found that the amount of computational power used to train the largest AI models had doubled every 3.4 months since 2012.

The San Francisco-based for-profit AI research lab has now added new data to its analysis, showing how the post-2012 doubling time compares with the historic one since the beginning of the field. From 1959 to 2012, the amount of computational power used doubled every two years, tracking Moore’s Law. This means the resources used today are doubling seven times faster than before.
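
As a back-of-envelope check, the seven-times figure follows directly from the two doubling times. Here is a minimal sketch in Python (the 24-month and 3.4-month inputs simply restate the numbers above; the yearly growth factors are derived, not from OpenAI's post):

    # Back-of-envelope check of the two doubling times in the article.
    # Assumptions: 24 months per doubling before 2012 ("every two years")
    # and 3.4 months per doubling after 2012, as OpenAI measured.

    PRE_2012_DOUBLING_MONTHS = 24.0
    POST_2012_DOUBLING_MONTHS = 3.4

    # Ratio of doubling rates: how many times faster compute use grows now.
    speedup = PRE_2012_DOUBLING_MONTHS / POST_2012_DOUBLING_MONTHS
    print(f"Doubling is ~{speedup:.1f}x faster than before")  # ~7.1x

    # The same rates expressed as growth per year: 2 ** (12 / doubling_months).
    pre_yearly = 2 ** (12 / PRE_2012_DOUBLING_MONTHS)    # ~1.4x per year
    post_yearly = 2 ** (12 / POST_2012_DOUBLING_MONTHS)  # ~11.5x per year
    print(f"Yearly growth: ~{pre_yearly:.1f}x then, ~{post_yearly:.1f}x now")

Note that the ratio of rates and the ratio of doubling times are reciprocals, which is the distinction the correction at the end of this article draws.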

[Chart: AI training compute over time, plotted on a logarithmic scale. Source: OpenAI]

This dramatic increase in the resources needed underscores just how costly the field’s achievements have become. Keep in mind that the above graph shows a logarithmic scale. On a linear scale (below), you can more clearly see how compute usage has increased by 300,000-fold in the last seven years.

The chart also notably omits some of the most recent breakthroughs, including Google’s large-scale language model BERT, OpenAI’s language model GPT-2, and DeepMind’s StarCraft II-playing model AlphaStar.

[Chart: Modern era (2012 to present day) AI compute usage on a linear scale. AlexNet to AlphaGo Zero: a 300,000x increase in compute. Source: OpenAI]
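
That 300,000-fold increase also squares with the 3.4-month doubling time. A rough sketch of the arithmetic (only the 300,000x figure and the doubling time come from OpenAI's analysis; the rest is logarithms):

    import math

    # How many doublings a 300,000x increase represents, and how long
    # that takes at one doubling every 3.4 months.
    INCREASE = 300_000
    DOUBLING_MONTHS = 3.4

    doublings = math.log2(INCREASE)        # ~18.2 doublings
    months = doublings * DOUBLING_MONTHS   # ~62 months
    print(f"{doublings:.1f} doublings over ~{months / 12:.1f} years")
    # About 5.2 years, roughly the AlexNet (2012) to AlphaGo Zero (2017)
    # span plotted on OpenAI's chart.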

In the past year, more and more researchers have sounded the alarm on the exploding costs of deep learning. In June, an analysis from researchers at the University of Massachusetts, Amherst, showed how these increasing computational costs directly translate into carbon emissions.

In their paper, they also noted how the trend exacerbates the privatization of AI research because it undermines the ability of academic labs to compete with much more resource-rich private ones.

In response to this growing concern, several industry groups have made recommendations. The Allen Institute for Artificial Intelligence, a nonprofit research firm in Seattle, for example, has proposed that researchers always publish the financial and computational costs of training their models alongside their performance results.

In its own blog post, OpenAI suggested that policymakers increase funding to academic researchers to bridge the resource gap between academic and industry labs.

Correction: A previous version of this article incorrectly stated the doubling time today is more than seven times the rate before. The resources used are doubling seven times faster, and the doubling time itself is one-seventh the previous time.
