
AI researchers need to stop hiding the climate toll of their work

August 2, 2019
Image: a data center. Alexander Heinl/Picture-Alliance/DPA/AP Images

The Allen Institute for Artificial Intelligence (AI2) is proposing a new way to incentivize energy-efficient machine learning.

Exploding footprint: More researchers are sounding the alarm about the growing costs of deep learning. In 2018, OpenAI published a study showing that the computational resources required to train large models were doubling every three to four months. In June, another study found that developing large-scale natural-language processing models, in particular, could produce a shocking carbon footprint.

The trend is driven by the research community’s emphasis on advancing the state of the art—with little regard for cost. Leaderboards celebrate performance breakthroughs, for example, but rarely mention what those incremental improvements cost to achieve. Often, linear increases in performance are unlocked only through exponential increases in resources. At this rate, one expert predicts, AI could account for as much as one-tenth of the world’s electricity use by 2025.

Rich get richer: These statistics aren’t just concerning from an environmental perspective. They also have implications for the field’s diversity and advancement. The sheer amount of resources needed to produce notable results privileges private AI labs over academic ones. It could also restrict the field to shorter-term projects aligned with corporate incentives, at the expense of longer-term advances that would benefit the public.

Show your work: In a new paper, researchers at the Seattle-based AI2 have proposed a new way to mitigate this trend. They recommend that AI researchers always publish the financial and computational costs of training their models along with their performance results. The authors hope that increasing transparency into what it takes to achieve performance gains will motivate more investment in the development of efficient machine-learning algorithms.
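To make that recommendation concrete, here is a minimal sketch, in Python, of the kind of cost reporting a paper could publish alongside its headline accuracy. The constants for GPU power draw, grid carbon intensity, and cloud pricing are illustrative assumptions, not figures from the AI2 paper.

```python
# Minimal sketch: reporting training cost alongside accuracy.
# All constants below are illustrative assumptions, not figures from the AI2 paper.

def training_cost_report(gpu_hours: float, accuracy: float,
                         gpu_power_kw: float = 0.3,       # assumed average draw per GPU (kW)
                         carbon_kg_per_kwh: float = 0.4,  # assumed grid carbon intensity (kg CO2/kWh)
                         usd_per_gpu_hour: float = 2.5):  # assumed cloud price per GPU-hour
    """Return the cost metrics a paper could report next to its performance results."""
    energy_kwh = gpu_hours * gpu_power_kw
    return {
        "accuracy": accuracy,
        "gpu_hours": gpu_hours,
        "energy_kwh": round(energy_kwh, 1),
        "co2_kg": round(energy_kwh * carbon_kg_per_kwh, 1),
        "cloud_cost_usd": round(gpu_hours * usd_per_gpu_hour, 2),
    }

if __name__ == "__main__":
    # A hypothetical model trained for 10,000 GPU-hours reaching 92.4% accuracy.
    print(training_cost_report(gpu_hours=10_000, accuracy=0.924))
```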

Oren Etzioni, the CEO of AI2 and an author on the paper, also thinks that reviewers for publications and conferences should reward papers that improve efficiency as much as those that improve accuracy. Until the community standardizes efficiency metrics, though, it will be difficult to judge the importance of such contributions. “I view reporting these numbers as necessary but not sufficient,” he says.

Why now? Recent years have seen a dramatic escalation in the amount of computing power that corporate research labs are throwing at deep learning. 

But Etzioni hopes the community can be more aware of the trade-offs. Plus, investing in more efficient algorithms could get more mileage out of available resources and produce other gains. It’s not an either-or thing, he says: “We just want to have a better balance in the field.”
