
AI researchers need to stop hiding the climate toll of their work

The Allen Institute for Artificial Intelligence (AI2) is proposing a new way to incentivize energy-efficient machine learning.

Exploding footprint: More researchers are sounding the alarm about the growing costs of deep learning. In 2018, OpenAI published a study showing that the computational resources required to train the largest models were doubling every three to four months. In June, another study found that developing large-scale natural-language processing models, in particular, can produce a shockingly large carbon footprint.
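To get a feel for what that doubling rate implies, the short Python sketch below simply compounds it; the 3.5-month midpoint and the five-year window are illustrative assumptions, not figures from the study.

```python
# Back-of-the-envelope: compounding a 3-to-4-month doubling time in training compute.
# The 3.5-month midpoint and the five-year window are illustrative assumptions.
doubling_time_months = 3.5
window_years = 5

doublings = window_years * 12 / doubling_time_months
growth = 2 ** doublings
print(f"{doublings:.1f} doublings over {window_years} years -> ~{growth:,.0f}x more compute")
# roughly 17 doublings, i.e. a growth factor on the order of 100,000x
```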


The trend is driven by the research community’s emphasis on advancing the state of the art with little regard for cost. Leaderboards celebrate performance breakthroughs, for example, but rarely mention what those incremental improvements cost, and linear gains in performance are often unlocked only through exponential increases in resources. At this rate, one expert predicts, AI could account for as much as one-tenth of the world’s electricity use by 2025.


Rich get richer: These statistics aren’t just concerning from an environmental perspective. They also have implications for the field’s diversity and progress. The sheer amount of resources needed to produce notable results privileges private AI labs over academic ones, and it could steer research toward shorter-term projects aligned with corporate incentives rather than longer-term advances that would benefit the public.

Show your work: In a new paper, researchers at the Seattle-based AI2 have proposed a new way to mitigate this trend. They recommend that AI researchers always publish the financial and computational costs of training their models along with their performance results. The authors hope that increasing transparency into what it takes to achieve performance gains will motivate more investment in the development of efficient machine-learning algorithms.
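The bookkeeping behind such a disclosure can be fairly lightweight: log what a run consumed next to its score. The Python sketch below is a minimal illustration of that idea, not the reporting scheme from the AI2 paper; train_and_evaluate is a hypothetical training routine, and the GPU count, power draw, and electricity price are placeholder assumptions.

```python
import time

def report_run(train_and_evaluate, gpu_count=8, gpu_power_kw=0.3, price_per_kwh=0.12):
    """Train a model and print its score next to a rough cost estimate.

    train_and_evaluate is a hypothetical callable that runs training and
    returns a single accuracy figure; the hardware and price numbers are
    placeholder assumptions, not measurements.
    """
    start = time.time()
    accuracy = train_and_evaluate()
    gpu_hours = (time.time() - start) / 3600 * gpu_count

    energy_kwh = gpu_hours * gpu_power_kw
    print(f"accuracy:      {accuracy:.3f}")
    print(f"GPU-hours:     {gpu_hours:.1f}")
    print(f"energy (est.): {energy_kwh:.1f} kWh, ~${energy_kwh * price_per_kwh:.2f}")
```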

Oren Etzioni, the CEO of AI2 and a coauthor of the paper, also thinks that reviewers at publications and conferences should reward papers that improve efficiency as much as those that improve accuracy. Until the field standardizes efficiency metrics, though, it will be difficult to evaluate the importance of such contributions. “I view reporting these numbers as necessary but not sufficient,” he says.
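What a standardized efficiency metric would look like is still an open question. The toy comparison below, with made-up numbers, only illustrates the general idea of normalizing a score by the compute spent to reach it.

```python
# Toy illustration of an efficiency metric: score per unit of compute.
# Accuracy and petaFLOP figures are invented for the example.
runs = {
    "baseline":    {"accuracy": 0.90, "pflops": 1.0},
    "10x compute": {"accuracy": 0.91, "pflops": 10.0},
}

for name, r in runs.items():
    efficiency = r["accuracy"] / r["pflops"]
    print(f"{name}: accuracy {r['accuracy']:.2f}, {efficiency:.3f} accuracy per petaFLOP")
```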

Why now? Recent years have seen a dramatic escalation in the amount of computing power that corporate research labs are throwing at deep learning. 

But Etzioni hopes the community can be more aware of the trade-offs. Plus, investing in more efficient algorithms could wring more mileage out of available resources and produce other gains. It’s not an either-or thing, he says: “We just want to have a better balance in the field.”
