…[Most] data handling and analysis tools [that] are used for processing the raw observational data and the results of simulation runs, and for sharing climate data with the broader scientific community…[are] built by the climate scientists themselves, who have little or no training in software engineering. As a result the quality of this software varies tremendously… some data processing tools are barely even tested.
There are only a handful of fields in which scientists write their own code (bioinformatics, mathematics, and physics come to mind), so it's a minor miracle that people who have spent their lives thinking about atmospheric physics and the paleohistoric climate record are able to produce software at all.
Which is one reason why formally-trained software engineers have a lot to offer climate science. Easterbrook’s paper, Climate Change: A Grand Software Challenge, outlines all the ways that programmers (and those who think like programmers) could use their skills in service of preserving a livable climate for generations to come.
The most obvious way programmers can help is by developing tools that can handle the massive datasets and earth-system models required to simulate a changing climate. Many of these models are run on customized and constantly-evolving supercomputer systems, which can make reproducing individual experiments almost impossible. In addition, the data are processed in so many different ways that scientists other than the ones who conducted an experiment cannot meaningfully engage with its results without adequate metadata, metadata that is currently missing.
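To make the metadata point concrete, here is a minimal sketch of the kind of provenance record a processed climate dataset would need to travel with it. All field names and values are illustrative assumptions, not taken from any real metadata standard or from Easterbrook's paper:

```python
import json

# Hypothetical provenance record for a processed climate dataset.
# Each field answers a question another scientist would ask before
# trusting or reusing the data. Names and values are illustrative.
provenance = {
    "source_dataset": "raw_station_obs_v2",  # where the raw data came from
    "processing_steps": [
        # every transformation applied, in order, with tool versions
        {"step": "quality_control", "tool": "qc_filter", "version": "1.3"},
        {"step": "regridding", "method": "bilinear", "target_grid": "1x1deg"},
    ],
    "model_run": {
        "code_revision": "abc1234",      # exact code version used
        "platform": "supercomputer-X",   # hardware, since systems keep evolving
    },
}

# Serialize the record so it can be stored alongside the data itself.
record = json.dumps(provenance, indent=2)
print(record)
```

The design idea is simply that reproducibility requires recording not just the data but every step and system that touched it; a record like this, shipped with each dataset, is what lets an outside scientist reconstruct what was actually done.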
The video below shows an experimental run of a climate model with a resolution so high that it approaches that of a traditional weather model. (This is what scientists do when they want to show off the size of their supercomputers.)
The other areas Easterbrook would have coders tackle go beyond the software used to model climate itself. They include the difficult task of, in essence, educating other scientists, policymakers, activists, journalists, and ordinary individuals about responses to climate change, both mitigation and adaptation.
Visualization is a powerful educational tool, but Easterbrook notes that…
…scientific simulations are often built without concern for how the results might be communicated with broader audiences, while visualizations developed for non-scientists are often built without good connections to the latest science. Research in this space will bring together the latest science with expertise in visualization and information design, to develop interactive tools for a variety of non-specialist audiences.
Easterbrook’s paper helps us imagine a future in which budding coders will keep the people of the U.S. and the planet plugged in to what’s happening to, and what’s to be done about, Earth’s climate.
It’s a worthy goal. Current climate models mostly show us eating bark and spit-roasting our pets in the blasted hellscape that’s left after we’ve nuked each other over whatever crumbs of arable land are left.
Climate model image courtesy NCAR.