
More than a dozen wildfires continue to rage in Southern California, where hot, gusty winds are spreading embers in all directions and making it difficult for firefighters to put out the flames. The United States Forest Service deployed manned aircraft over the fires to map their locations, find hot spots, and determine the directions in which the blazes are spreading. But weather conditions have grounded many planes.

Now, the National Interagency Fire Center has called on NASA to use its unmanned aerial vehicle equipped with a new thermal-imaging sensor to help track the fires.

The sensor is much more sensitive in the thermal range than the line scanners normally used to map fires. It can also track a fire with greater accuracy, says Everett Hinkley, the National Remote Sensing Program manager at the U.S. Forest Service and a principal investigator on the project to test and develop the sensor. (Technology Review previously reported on the project and the technology behind the sensor. See “Mapping Wildfires.”)

NASA’s unmanned aerial vehicle, Ikhana, began its first 10-hour mission on October 24 and will continue missions for three days. In its first mission, Ikhana flew over seven different fires. The aircraft lingered over each fire for about half an hour and then repeated the drill in order to monitor how each fire was progressing, says Vince Ambrosia, an engineer at NASA Ames Research Center and the principal investigator of the thermal-imaging sensor. Both Hinkley and Ambrosia are stationed at the fire center located in Boise, ID, overseeing the use of the sensor for the California fires.

The images are processed onboard the aircraft, and the data is sent in real time to a ground station where it is incorporated into a Google Earth map. (See images below.) Capturing the images in real time is a major advance. Previously, images captured by a sensor had to be put on a “thumb drive” and dropped out of the aircraft through a tube as it flew near a command station, or the aircraft had to land so that the data could be given to a colleague to perform the analysis.
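The article notes that the processed imagery is folded into a Google Earth map at the ground station. Google Earth ingests georeferenced imagery via KML ground overlays; the sketch below shows what such an overlay file looks like. The function name, parameters, and coordinates are illustrative assumptions — the actual NASA/USFS data formats and pipeline are not described in the article.

```python
def make_fire_overlay_kml(name, image_href, north, south, east, west):
    """Build a minimal KML GroundOverlay that drapes a processed
    thermal image over its ground footprint in Google Earth.

    Illustrative sketch only: the real downlink pipeline and file
    formats used for the Ikhana missions are not public in this piece.
    """
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{name}</name>
    <Icon><href>{image_href}</href></Icon>
    <LatLonBox>
      <north>{north}</north>
      <south>{south}</south>
      <east>{east}</east>
      <west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""

# Example: a hypothetical footprint for one pass over San Diego County
kml = make_fire_overlay_kml("Harris fire pass 1", "harris_pass1.png",
                            32.75, 32.55, -116.55, -116.80)
```

Each new pass from the aircraft would simply refresh the overlay image and bounding box, which is what makes the near-real-time updates Hinkley describes possible.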

“These particular fires are very dynamic, fast-moving fires, so having very frequent updates … will be a big help,” says Hinkley.

On the morning of October 24, NASA began mapping the wildfires in California using a new thermal-imaging sensor onboard an unmanned aerial vehicle. This image is of the Harris fire in San Diego County, on the California-Mexico border. The image was taken using the 12-channel spectral sensor, and the data was automatically processed onboard the aircraft, then sent to ground stations where it was incorporated into a Google Earth map. The data is displayed in an array of colors: the fire’s hot, active spots are yellow; warm areas that were recently burned are shades of red; and areas that are cooling are blue.
Credit: NASA
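The caption's color scheme amounts to classifying each thermal reading into one of three display bins. A minimal sketch of that mapping follows; the temperature thresholds are invented for illustration, since the article does not give the sensor's actual cutoffs.

```python
def classify_pixel(temp_c, hot=300.0, warm=100.0):
    """Map a thermal reading (degrees C) to the display palette from
    the caption: active fire -> yellow, recently burned and still
    warm -> red, cooling or unburned ground -> blue.

    The threshold values are illustrative assumptions, not the
    sensor's real calibration.
    """
    if temp_c >= hot:
        return "yellow"   # hot, active fire front
    if temp_c >= warm:
        return "red"      # recently burned, still warm
    return "blue"         # cooling area

# Classify a short scan line of readings
row = [25.0, 180.0, 450.0]
colors = [classify_pixel(t) for t in row]  # ['blue', 'red', 'yellow']
```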
