Last week, researchers at the University of California, Davis, overlaid FEMA’s flood-zone maps on top of satellite imagery of the devastating flooding around Houston after Harvey poured more than 40 inches of rain across the region.
The preliminary assessment found that two-thirds of the inundation occurred outside the federal agency’s 100-year floodplains, where there should be only a 1 percent chance of flooding in any given year. More than half of the deluge happened “outside of any mapped flood zone,” even including 500-year events, in areas that should face only “minimal flood hazard” (see “How Much Is Climate Change to Blame for Tropical Storm Harvey?”).
This, in part, underscores the rare severity of the storm that hovered over the Texas coastline for days. But it also arguably highlights inadequacies in our federal flood risk assessments, since by some calculations Harvey “represents the third ‘500-year’ flood in the Houston area in the past three years,” as the UC Davis researchers note.
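The return-period framing above reduces to simple probability arithmetic. The sketch below is a simplification that assumes a stationary climate and statistically independent years, which is precisely the assumption the rest of the article calls into question; it shows what the map designations imply, and how improbable three "500-year" floods in three years would be if those designations were right.

```python
def annual_prob(return_period_years: float) -> float:
    """Annual exceedance probability implied by a return period (1/T)."""
    return 1.0 / return_period_years

def prob_at_least_one(return_period_years: float, years: int) -> float:
    """Chance of at least one such event over `years` independent years."""
    p = annual_prob(return_period_years)
    return 1.0 - (1.0 - p) ** years

# A "100-year" flood: a 1% chance in any given year,
# but roughly a 26% chance over a 30-year mortgage.
print(f"{annual_prob(100):.0%} per year; "
      f"{prob_at_least_one(100, 30):.0%} over 30 years")

# Three "500-year" floods in three consecutive years, if each year
# really carried only a 0.2% chance: about 1 in 125 million.
p500 = annual_prob(500)
print(f"{p500 ** 3:.0e}")  # 8e-09
```

Under stationary assumptions, that sequence is effectively impossible, which is why the researchers read it as evidence the maps no longer describe the underlying risk.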
That “basically refutes suggestions that Houston has just suffered from random ‘bad luck,’” said Nicholas Pinter, associate director of the UC Davis Center for Watershed Sciences, in an e-mail. “We scientists are ultra-cautious about reading climate change in any single weather event, and that caution is appropriate. But there is a growing suspicion that the U.S. may be creeping over a meteorological tipping point.”
The crucial problem is that flood-zone maps are based on historical patterns that are increasingly divorced from the current dangers under changing climate conditions. That, in turn, means that planning policies, building codes, insurance programs, and development patterns based on these assessments can often be dangerously out of date as well. In many cases, we’re constructing cities and flood protections based on the climate of the past rather than the conditions of the future—or even the present. That’s subjecting citizens to ever greater dangers, and society to far higher costs for disaster relief and reconstruction in the aftermath of events like Harvey or, as is looking increasingly likely, Hurricane Irma.
Some scientists have been sounding this warning for years, arguing that flood and storm risk analysis needs to move beyond the “stationary” approach, which assumes that the statistical distribution of events in the past will remain constant moving forward.
“We can’t extrapolate the past into the future because of changes going on in the system,” says Paul Milly, research hydrologist at the U.S. Geological Survey, and lead author of a 2008 Science paper titled “Stationarity Is Dead: Whither Water Management?” “Climate change needs to be considered as a possible factor in the changing risks of floods and other hazardous events,” he says.
Among other factors, warmer air holds more moisture, and higher sea levels increase the height of storm surges, all of which can amplify the magnitude and destructive capacity of storms.
Progress toward new methodologies, however, has been slow and uneven, in part because of political complexities—and in part because it’s a challenging science. The climate system is highly complex, our knowledge is incomplete, and projection models generally include broad ranges of potential impacts, dependent on future greenhouse-gas emissions, environmental tipping points, and other factors.
But some scientists are certainly trying to update our understanding of the growing dangers from climate change. Kerry Emanuel, a hurricane researcher and professor of atmospheric science at MIT, recently evaluated the future risk of hurricane rainfall in Boston—and found a stark shift in threat levels as climate change increases the frequency of storms and amount of rain per storm.
A 100-year hurricane rainfall event before 2000 in Boston could become nearly a one-in-10-year occurrence by 2081, meaning it would have around a 10 percent chance of happening in any given year, he found. Likewise, a previously 1,000-year event in the region could become closer to a 50-year occurrence.
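Those shifted return periods can be translated into annual odds with the standard 1/T relation. This is a back-of-the-envelope sketch, not a calculation from Emanuel's paper, and it again assumes independent years within each period:

```python
def annual_prob(return_period: float) -> float:
    """Annual exceedance probability implied by a return period (1/T)."""
    return 1.0 / return_period

def risk_over(return_period: float, years: int = 30) -> float:
    """Chance of at least one event over `years`, assuming independent years."""
    return 1.0 - (1.0 - annual_prob(return_period)) ** years

# Boston hurricane rainfall, per the shifts described above:
for label, before, after in [("100-year", 100, 10), ("1,000-year", 1000, 50)]:
    print(f"{label} event: {annual_prob(before):.2%}/yr -> "
          f"{annual_prob(after):.0%}/yr; 30-year risk "
          f"{risk_over(before):.0%} -> {risk_over(after):.0%}")
```

For the formerly 100-year event, the chance of experiencing at least one over a 30-year span jumps from roughly 26 percent to about 96 percent, which is the practical meaning of the shift Emanuel describes.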
In a paper published earlier this year, Emanuel wrote that limited aircraft data for near-coastal Atlantic storms, as well as the need to incorporate projected climate change, required the use of simulated storms. To those, he applied a broad range of climate models, from NOAA, the Met Office Hadley Centre, the Max Planck Institute for Meteorology, and other institutions. These models were designed to “simulate the response of both winds and thermodynamic conditions to changing climate.”
In general, the research shows a substantial increase in the number of storms that could intensify just before landfall by 2100. But even if the broad direction is clear, Emanuel noted, it will be difficult to accurately forecast that late shift for any given storm as it approaches, requiring further improvements to hurricane forecasting.
A few cities, and some engineering firms, have already begun to adopt development standards that incorporate future climate-change threats. Notably, in the aftermath of Hurricane Sandy, the New York City Department of Environmental Protection conducted a comprehensive assessment, and concluded that some $1 billion in assets were under threat from future sea-level rise and storm surges. The analysis added 30 inches of flooding on top of FEMA’s 100-year flood maps, adopting the high-end forecast from the New York City Panel on Climate Change, and ultimately recommended $315 million in facility upgrades.
Similarly, in 2015, President Obama issued an executive order that established new flood standards for federally funded projects that took into account the rising risks of climate change. It required agencies to either build two or three feet above 100-year flood lines, depending on the project type; base new development on 500-year flood elevations; or otherwise determine appropriate construction standards based on the best available climate science.
Less than two weeks before Hurricane Harvey made landfall, President Trump rescinded that order.