Television meteorologist Paul Douglas remembers the day back in 1997 when he had the inspiration that led to the launch of his company. He had predicted on-air a rainstorm moving through Minnesota’s Twin Cities, only to be confronted off-air by a flood of e-mails from local viewers wanting to know how the storm might affect their plans for the day. “It was so frustrating,” he recalls. “‘What time will it start raining in my town?’ ‘I’m driving north; will I beat the rain?’ ‘My wedding is this afternoon; will it be rained out?’”
Douglas was helpless to answer such queries. His forecast was based largely on information from the National Weather Service, which predicts conditions for 12-kilometer-by-12-kilometer areas and says nothing about how the weather varies within them. And even if Douglas could have made such localized forecasts, he had no way to disseminate the personalized information effectively. Then it hit him: maybe he could give everyone a customized weather report. Two years later, in 1999, he founded Minneapolis-based Digital Cyclone, which predicts weather events over six-kilometer-by-six-kilometer areas and delivers the information over mobile phones.
Digital Cyclone is just one of several companies taking weather forecasting to new levels of usefulness and precision. The Weather Channel in Atlanta, GA, for one, now provides eight-kilometer-resolution maps and an alert service for desktop computers. And AccuWeather in State College, PA, generates one-kilometer-resolution weather maps that are available on personal digital assistants and Internet-enabled phones. Fed by vast reservoirs of cheap computing power, new mathematical techniques for fine-tuning weather models, and high-tech observation systems, these firms are exploiting a much-neglected facet of weather forecasting Douglas calls the “short game”: that is, advising people about the particular weather conditions in their individual neighborhoods or towns, not just their regions.
The move to such higher-resolution forecasts could translate into huge savings. According to John Dutton, dean emeritus of Pennsylvania State University’s College of Earth and Mineral Sciences, over $3 trillion of the nation’s annual economy is affected by weather events. Farmers, construction workers, snow removal crews, energy maintenance workers, railroad dispatchers, and truck drivers depend on accurate and precise forecasts to effectively manage their time and resources. An unexpected cool air mass on a summer day could stick a power company with millions of dollars of unused electricity. A change in wind speed could alter a farmer’s decision to spray fertilizers, which can disperse and even ruin neighboring crops in winds above 11 kilometers per hour. A minor temperature difference can determine whether a snow removal crew lays down sand or salt. Salt is generally effective only above -7 °C, so a wrong decision to use it can not only fail to reduce ice but also waste thousands of dollars of taxpayers’ money.
“Whether it’s for employees or soccer moms, there’s an insatiable demand for weather information,” says Douglas. “A lot of incredibly useful information is already available, but there’s a real opportunity to filter it, make it more timely and detailed and accurate, and provide it in more useful form.”
For five decades, weather forecasting in the United States has relied on models that run on the latest computer technology at the National Weather Service’s National Centers for Environmental Prediction in Camp Springs, MD. The models use more than 100 million daily measurements of temperature, moisture, air pressure, wind speed, and wind direction gathered from different locations around the world. Based on this data, forecasts are calculated on global, national, and regional scales every six hours for areas as small as 12 kilometers by 12 kilometers.
Real weather, of course, can vary quite a bit over distances as short as a few kilometers, says Craig Burfeind, a meteorologist who cofounded Digital Cyclone with Douglas. “Winter storms can have a precise line, and a few miles to either side of that line can mean the difference between rain and six inches of snow.” Burfeind notes that one day this past winter, temperatures in the southern suburbs of Minneapolis hit the low 20s C (70s F) while the northern suburbs remained around freezing, resulting in temperature differences of 6 °C or more over a six-kilometer range. And Minneapolis isn’t even near a large geological feature, such as a mountain or a valley, that can affect wind speed and direction, humidity, and temperature, and create measurable differences across a small area. Even a sizeable body of water can create sharp temperature contrasts that contribute to lake-effect snow, heavy coastal fog, or unexpected thunderstorms.
The Weather Service would be happy to produce forecasts for everyone’s locale if it were practical. But increasing the resolution of the forecast grid to, say, six kilometers by six kilometers actually requires eight times as much calculation. “Our next step in that realm is to 10 kilometers, which won’t be operational until the end of 2004,” says Lauren Morone, operations officer at the National Centers for Environmental Prediction.
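The eightfold jump follows from the geometry of the model grid: halving the horizontal spacing quadruples the number of grid cells, and numerical stability typically forces the model’s time step to be halved as well, for 2 × 2 × 2 = 8 times the computation. A minimal back-of-the-envelope sketch (the cubic scaling rule is standard numerical-modeling reasoning; the function name is illustrative, not from any forecasting system):

```python
def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Rough relative compute cost of refining a 2-D forecast grid.

    Halving the grid spacing quadruples the cell count, and the
    stability (CFL) condition roughly halves the usable time step,
    so cost grows with the cube of the refinement factor.
    """
    refinement = coarse_km / fine_km
    return refinement ** 3

# 12 km -> 6 km: 2**3 = 8 times the calculation, as the article notes
print(relative_cost(12, 6))
```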
But there is another way. In the 1990s, researchers at Pennsylvania State University began incorporating the raw data collected from the National Weather Service into their own PC-based models. “Running a model used to be a large centralized operation, like the Manhattan Project,” says Penn State climatologist Paul Knight. In contrast, he says, the new generation of PC models complete fewer calculations for a smaller area of the globe and are therefore able to produce high-resolution, localized weather forecasts that can be churned out relatively quickly.
Digital Cyclone, for one, is capitalizing on Penn State’s success. The company provides forecasts for a number of metropolitan U.S. areas, using a single PC to turn out a weather prediction for a particular city. The forecasts are twice as frequent as those coming from the National Weather Service and cover a smaller area; that is, they run every three hours and have a resolution of six kilometers within about a 120-kilometer radius of the city.
Customers of Digital Cyclone can access the information from a Web site. By keying in their locations, they can get weather maps centered on their towns, complete with radar images and projected storm tracks. But the real value of Digital Cyclone’s service, says Douglas, is that people can acquire the information from their Internet-enabled mobile phones. Later this year, those same phones will emit audible alerts sent out by the company and tailored to people’s needs. And at a time when more and more mobile phones are equipped with Global Positioning System (GPS) receivers and software that provide information about their geographical location, Digital Cyclone is developing software, expected to become available in the next few years, that would use GPS data to offer high-resolution weather-forecasting maps automatically centered on the phone’s location.
On the Business Front
Having personalized alerts beamed to your handheld device might seem the ultimate in weather awareness automation. But businesses also need detailed forecasts, and some weather providers are already integrating the information directly into the computers of their large industrial customers. A leader in this new field of “weather-enabled” operations is Meteorlogix in Minneapolis. The company uses satellite communications to relay customized weather updates directly into a client’s operational computer system; preprogrammed to work with Meteorlogix’s alerts, the computer system recognizes which operations are affected by the weather and then takes appropriate action.
As computers become faster and cheaper and weather observation instruments improve, researchers promise even higher-resolution forecasts. In two or three years, some predict, resolutions will drop down to one kilometer. Getting there, however, will be an uphill battle. The first problem is the huge increase in computing power necessary to account for the additional variables of weather on such a small scale. “The topography of the local terrain, the presence of bodies of water, vegetation, cloud formation: all this has to be taken into account,” says Joel Myers, founder and president of AccuWeather. In fact, going down to one kilometer from four kilometers requires about 16 times the number-crunching muscle.
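That 16-fold figure matches the growth in grid cells alone: refining from four kilometers to one kilometer packs 4 × 4 = 16 fine cells into each coarse cell of the two-dimensional grid (and if the model’s time step must also shrink in proportion, the true cost climbs higher still). A small sketch of the cell-count arithmetic, with an illustrative function name:

```python
def cells_per_coarse_cell(coarse_km: float, fine_km: float) -> int:
    """Fine grid cells that replace each coarse cell in a 2-D grid."""
    refinement = coarse_km / fine_km
    return round(refinement ** 2)

# 4 km -> 1 km: 4**2 = 16 times the grid points per model layer
print(cells_per_coarse_cell(4, 1))
```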
Despite the heavy computational price, Lloyd Treinish and his colleagues at IBM’s Yorktown Heights, NY, research facility are working on one-kilometer-resolution weather forecasts. The so-called Deep Thunder project is part of a larger IBM effort known as Deep Computing, which is concerned with analyzing large amounts of data and solving complex computational problems. Like many companies working in the area of high-resolution weather forecasting, IBM foresees business opportunities in providing better prediction models to weather-sensitive companies. Treinish and his team have modified a standard regional model, developed at Colorado State University, and adapted it to the terrain, wind flow, and ocean-driven moisture patterns of the New York area. They use data received from the National Weather Service and then verify their forecasts using nearby weather stations, including one installed in Fishkill, NY. Later this year, IBM will install five more stations at its facilities in southeastern New York to help further fine-tune the models. Deep Thunder generates an updated 24-hour forecast two to four times a day, which requires almost two hours of computer time per forecast.
The modified model has proven its worth several times, as when it recognized the severity of a February 2003 blizzard some nine hours before the Weather Service. Treinish attributes the model’s accuracy to its fine resolution, which in addition to providing a more detailed look at local weather often leads to better forecasts for an entire region. “By getting at the physics behind smaller-scale weather events, you can get a better picture of what’s going on with larger-scale events like storms,” he says.
Treinish even talks of getting down to 500-meter resolution in the future, though he points out it would require modeling wind turbulence around New York City’s tall buildings, and it would demand about 100 times as much number crunching. “But that’s not an unheard-of increase in computing power, if you’re willing to wait a few years,” he says.
Such accurate, high-resolution weather forecasts depend on models that are regularly recalibrated by comparing their predictions against actual observed weather. But trying to tell whether or not forecasts are improving is harder than it sounds, because the differences can be subtle and complicated: perhaps wind speed predictions are getting a bit more accurate, while temperature predictions are getting a bit worse. David Stensrud of the National Oceanic and Atmospheric Administration’s Severe Storm Laboratory in Norman, OK, notes that improving the models is slow going. “These models are so complicated that if you can correct for one problem you can easily cause another one,” Stensrud says. “You have to run the new model over a large number of cases to check it, and that makes it a huge, labor-intensive effort.”
And of course, a forecasting model is only as good as the instruments feeding it data. Eventually, meteorologists may be able to access finely detailed weather data from vast networks of sensors spaced just tens of meters apart over many parts of the country. A group led by Deborah Estrin, a computer scientist and director of the Center for Embedded Networked Sensing at the University of California, Los Angeles, is already embedding wireless sensor networks designed to monitor microclimate data (including, eventually, carbon dioxide levels) around small patches of trees and plants. “We want to explore the relationship between monitoring weather on a regional scale and on a microscale,” she says.
Gathering such specific data may be in our future, but is it practical? “Putting out forecasts at the level of city blocks definitely makes it seem as if the forecasts are more precise,” says Craig Edwards, chief meteorologist with the Minneapolis office of the National Weather Service. “But the forecasts for one block would probably be the same as for other blocks.” There’s also a trade-off between resolution and how far into the future a model can make accurate predictions, says Young, because of how quickly small-scale weather phenomena change. “Today’s high-resolution forecasts are useful for a day or so,” he says. “We’re rapidly heading to resolutions that won’t buy you anything beyond six hours.”
Regardless of the new technology’s utility, meteorologists will be able to show off more of it in their forecasts as the public gets comfortable with weather jargon and maps. “The weather IQ of the public has increased tremendously over the last 10 years,” says AccuWeather’s Myers. “Instead of saying there’s a chance of rain today, we could say there’s a 20 percent chance of rain between 10:00 a.m. and noon, a 40 percent chance between noon and 2:00 p.m., and a 20 percent chance after that. That’s the sort of information you could use to schedule your golf game.”
Others in High-Resolution Forecasting

| Company | Resolution | Services |
| --- | --- | --- |
| Meteo Consult | Site-specific, based on location of weather stations | Weather maps and satellite and radar images delivered to Web-enabled mobile phones |
| | 110 meters | Text message forecasts sent to alphanumeric pagers and mobile phones |
| AWS Convergence | Site-specific, based on location of proprietary weather stations | Web-based application that provides forecasts for customers in the energy industry |
| Weather Services | Eight to 10 kilometers; point-specific forecasts | Weather maps and satellite and radar images delivered to people working in the media, energy, marine, and aviation industries (pilots can receive data in-flight on personal digital assistants) |