AI researchers need to stop hiding the climate toll of their work

The Allen Institute for Artificial Intelligence (AI2) is proposing a new way to incentivize energy-efficient machine learning.
Exploding footprint: More researchers are sounding the alarm about the growing costs of deep learning. In 2018, OpenAI published a study showing that the computational resources required to train large models were doubling every three to four months. In June, another study found that developing large-scale natural-language processing models, in particular, could produce a shocking carbon footprint.
The trend is driven by the research community’s emphasis on advancing the state of the art, with little regard for cost. Leaderboards celebrate performance breakthroughs, for example, but rarely mention what those incremental improvements take to achieve. Often, linear increases in performance are unlocked only through exponential increases in resources. At this rate, one expert predicts, AI could account for as much as one-tenth of the world’s electricity use by 2025.
Rich get richer: These statistics aren’t just concerning from an environmental perspective. They also have implications for the field’s diversity and advancement. The sheer amount of resources needed to produce notable results privileges private over academic AI labs, and it could restrict the field to shorter-term projects aligned with corporate incentives rather than longer-term advances that would benefit the public.
Show your work: In a new paper, researchers at the Seattle-based AI2 have proposed a new way to mitigate this trend. They recommend that AI researchers always publish the financial and computational costs of training their models along with their performance results. The authors hope that increasing transparency into what it takes to achieve performance gains will motivate more investment in the development of efficient machine-learning algorithms.
Oren Etzioni, the CEO of AI2 and an author on the paper, also thinks that paper reviewers for publications and conferences should reward submissions that improve efficiency as well as those that improve accuracy. He acknowledges, however, that until the community standardizes efficiency metrics, it will be difficult to evaluate the importance of such contributions. “I view reporting these numbers as necessary but not sufficient,” he says.
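To make the recommendation concrete, here is a minimal sketch (not from the AI2 paper) of what reporting cost alongside accuracy might look like in practice. The constants GPU_POWER_WATTS, PRICE_PER_KWH, and NUM_GPUS are illustrative assumptions, and train_model is a stand-in for a real training loop.

```python
import time

# Illustrative assumptions, not measurements from AI2's work.
GPU_POWER_WATTS = 300   # assumed average draw of one accelerator
PRICE_PER_KWH = 0.12    # assumed electricity price in USD
NUM_GPUS = 8            # assumed size of the training cluster


def train_model():
    """Placeholder for an actual training run."""
    time.sleep(1)       # stands in for hours of real training
    return {"accuracy": 0.87}


start = time.time()
results = train_model()
elapsed_hours = (time.time() - start) / 3600

gpu_hours = elapsed_hours * NUM_GPUS
energy_kwh = gpu_hours * GPU_POWER_WATTS / 1000
cost_usd = energy_kwh * PRICE_PER_KWH

# Report efficiency numbers next to the headline metric.
print(f"accuracy:      {results['accuracy']:.3f}")
print(f"GPU-hours:     {gpu_hours:.4f}")
print(f"energy (kWh):  {energy_kwh:.4f}")
print(f"est. cost ($): {cost_usd:.4f}")
```

Even a rough tally like this, included in a results table, would let readers weigh an accuracy gain against the compute spent to get it.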
Why now? Recent years have seen a dramatic escalation in the amount of computing power that corporate research labs are throwing at deep learning.
But Etzioni hopes the community can be more aware of the trade-offs. Plus, investing in more efficient algorithms could wring more mileage out of available resources and produce other gains. It’s not an either-or thing, he says: “We just want to have a better balance in the field.”