…[Most] data handling and analysis tools [that] are used for processing the raw observational data and the results of simulation runs, and for sharing climate data with the broader scientific community…[are] built by the climate scientists themselves, who have little or no training in software engineering. As a result the quality of this software varies tremendously… some data processing tools are barely even tested.
There are only a handful of fields in which scientists write their own code (bioinformatics, mathematics, and physics come to mind), so it’s a minor miracle that people who have spent their lives thinking about atmospheric physics and the paleohistoric climate record are able to produce software at all.
Which is one reason why formally trained software engineers have a lot to offer climate science. Easterbrook’s paper, Climate Change: A Grand Software Challenge, outlines all the ways that programmers (and those who think like programmers) could use their skills in service of preserving a livable climate for generations to come.
The most obvious way programmers can help is by developing tools that can handle the massive datasets and earth-system models required to simulate a changing climate. Many of these models are run on customized and constantly evolving supercomputer systems, which can make reproducing individual experiments almost impossible. In addition, the data are processed in so many different ways that adequate metadata, which is currently often missing, is needed before scientists other than those who ran the experiments can meaningfully engage with the results.
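To make the metadata point concrete, here is a minimal sketch of the kind of self-describing provenance record that lets an outside scientist interpret a processed dataset. It is loosely inspired by the CF (Climate and Forecast) metadata conventions; the specific field names, model name, and validation rule are illustrative assumptions, not anything from Easterbrook’s paper:

```python
import json
from datetime import datetime, timezone

# Attributes an outside scientist would need to interpret the data at all.
# This required set is an illustrative choice, not a formal standard.
REQUIRED_ATTRS = {"title", "institution", "source", "history", "units"}

def make_metadata(title, institution, source, units):
    """Build a metadata dict and stamp it with a processing-history entry."""
    return {
        "title": title,
        "institution": institution,
        "source": source,  # e.g. the model version and run that produced the data
        "units": units,
        "history": f"{datetime.now(timezone.utc).isoformat()} created",
    }

def missing_attrs(meta):
    """Return the set of required attributes the record lacks."""
    return REQUIRED_ATTRS - meta.keys()

meta = make_metadata(
    title="Monthly mean surface temperature, regridded to 1x1 degree",
    institution="Example Climate Lab",          # hypothetical
    source="HypotheticalGCM v2.1, run 42",      # hypothetical
    units="K",
)
print(json.dumps(meta, indent=2))
print("missing:", missing_attrs(meta))  # empty set -> record is complete
```

The point of the `missing_attrs` check is that completeness can be enforced mechanically at processing time, rather than discovered years later by a researcher trying to reuse the data.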
The video below shows an experimental run of a climate model with a resolution so high that it approaches that of a traditional weather model. (This is what scientists do when they want to show off the size of their supercomputers.)
Other areas Easterbrook would like to see coders tackle go beyond the software used to model the climate itself. They include the difficult task of, in essence, educating other scientists, policymakers, activists, journalists, and ordinary individuals about responses to climate change, including both mitigation and adaptation.
Visualization is a powerful educational tool, but Easterbrook notes that…
…scientific simulations are often built without concern for how the results might be communicated with broader audiences, while visualizations developed for non-scientists are often built without good connections to the latest science. Research in this space will bring together the latest science with expertise in visualization and information design, to develop interactive tools for a variety of non-specialist audiences.
Easterbrook’s paper helps us imagine a future in which budding coders will keep the people of the U.S. and the planet plugged in to what’s happening to, and what’s to be done about, Earth’s climate.
It’s a worthy goal. Current climate models mostly show us eating bark and spit-roasting our pets in the blasted hellscape that’s left after we’ve nuked each other over whatever crumbs of arable land are left.
Climate model image courtesy NCAR.