
Tracking the California Fires

NASA’s thermal-imaging sensor helps fight the flames.
October 25, 2007

More than a dozen wildfires continue to rage in Southern California, where hot, gusty winds are spreading embers in all directions and making it difficult for firefighters to put out the flames. The United States Forest Service deployed manned aircraft over the fires to map their locations, find hot spots, and determine the directions in which the blazes are spreading. But weather conditions have grounded many planes.

Fire tracking: On the morning of October 24, NASA began mapping the wildfires in California using a new thermal-imaging sensor onboard an unmanned aerial vehicle. This image is a 3-D perspective of the Harris fire in San Diego County. The fire can be seen burning on both sides of the ridge. The data is displayed in an array of colors: the fire’s hot, active spots are yellow; warm areas that were recently burned are shades of red; and areas that are cooling are blue.

Now, the National Interagency Fire Center has called on NASA to use its unmanned aerial vehicle equipped with a new thermal-imaging sensor to help track the fires.

The sensor is much more sensitive in the thermal range than the line scanners normally used to map fires. It can also track a fire with greater accuracy, says Everett Hinkley, the National Remote Sensing Program manager at the U.S. Forest Service and a principal investigator on the project to test and develop the sensor. (Technology Review previously reported on the project and the technology behind the sensor. See “Mapping Wildfires.”)

NASA’s unmanned aerial vehicle, Ikhana, began its first 10-hour mission on October 24 and will continue flying missions for three days. On its first mission, Ikhana flew over seven different fires, lingering over each for about half an hour and then repeating the circuit to monitor how each fire was progressing, says Vince Ambrosia, an engineer at NASA Ames Research Center and the principal investigator for the thermal-imaging sensor. Both Hinkley and Ambrosia are stationed at the National Interagency Fire Center in Boise, ID, overseeing the use of the sensor for the California fires.


The images are processed onboard the aircraft, and the data is sent in real time to a ground station, where it is incorporated into a Google Earth map. (See images below.) Delivering the images in real time is a major advance. Previously, images captured by a sensor had to be put on a “thumb drive” and dropped out of the aircraft through a tube as it flew near a command station, or the aircraft had to land so the data could be handed off for analysis.
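To make the Google Earth step concrete: a georeferenced image can be draped onto the map using a standard KML GroundOverlay, which ties the image’s corners to latitude and longitude bounds. The sketch below is a minimal illustration of that format only, not NASA’s actual ground-station code; the file names, overlay name, and bounding-box coordinates are hypothetical.

```python
# Minimal sketch: wrap a processed thermal image in a KML GroundOverlay so
# Google Earth can drape it over the terrain. All names and coordinates below
# are made-up examples, not values from NASA's pipeline.
from xml.sax.saxutils import escape

def ground_overlay_kml(name, image_href, north, south, east, west):
    """Return a KML document that drapes `image_href` over the given lat/lon box."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{escape(name)}</name>
    <Icon><href>{escape(image_href)}</href></Icon>
    <LatLonBox>
      <north>{north}</north>
      <south>{south}</south>
      <east>{east}</east>
      <west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>
"""

if __name__ == "__main__":
    # Hypothetical bounding box roughly over southern San Diego County.
    kml = ground_overlay_kml(
        name="Harris fire - thermal pass",
        image_href="harris_fire_thermal.png",
        north=32.75, south=32.55, east=-116.60, west=-116.90,
    )
    with open("harris_fire_overlay.kml", "w") as f:
        f.write(kml)
```

Opening the resulting .kml file in Google Earth positions the image over the specified bounds, which is the general mechanism by which imagery like this can be layered onto a map at the ground station.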

“These particular fires are very dynamic, fast-moving fires, so having very frequent updates … will be a big help,” says Hinkley.

On the morning of October 24, NASA began mapping the wildfires in California using a new thermal-imaging sensor onboard an unmanned aerial vehicle. This image is of the Harris fire in San Diego County, on the California-Mexico border. The image was taken using the 12-channel spectral sensor, and the data was automatically processed onboard the aircraft, then sent to ground stations where it was incorporated into a Google Earth map. The data is displayed in an array of colors: the fire’s hot, active spots are yellow; warm areas that were recently burned are shades of red; and areas that are cooling are blue.
Credit: NASA

This image, also obtained using NASA’s new thermal-imaging sensor, is a 3-D perspective of the Harris fire. The fire can be seen burning on both sides of the ridge. The data is once again displayed in an array of colors: the fire’s hot, active spots are yellow; warm areas that were recently burned are shades of red; and areas that are cooling are blue.
Credit: NASA
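The color key described in the captions amounts to classifying each pixel by apparent temperature: yellow for hot, actively burning areas, shades of red for warm, recently burned ground, and blue for areas that are cooling. The sketch below illustrates that idea; the temperature thresholds and shading are assumptions for illustration only, not the values used in the actual products, which are derived from the sensor’s 12-channel spectral data.

```python
# Illustrative sketch of the caption's color scheme. The thresholds are
# invented for this example and do not reflect the sensor's real processing.
def classify_pixel(temp_c):
    """Map a pixel's apparent temperature (degrees C, hypothetical scale) to an RGB color."""
    if temp_c >= 300:            # hot, actively burning
        return (255, 255, 0)     # yellow
    elif temp_c >= 60:           # recently burned, still warm
        shade = int(120 + 135 * (temp_c - 60) / 240)  # darker red when cooler
        return (shade, 0, 0)     # shades of red
    else:                        # cooling or background
        return (0, 0, 255)       # blue

# Example: a tiny 2x2 grid of temperatures becomes a grid of colors.
grid = [[450, 180], [75, 25]]
print([[classify_pixel(t) for t in row] for row in grid])
```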
