
There are two ways to get around the memory wall: the first is to increase the performance of a system’s memory, and the second is simply to slow down its CPU. FAWN does both: flash memory has much faster random access than disk-based storage, and FAWN’s slower processors require less power and waste fewer transistors trying to guess what’s coming next.

FAWN is composed of many individual nodes, each built around a single 500-megahertz AMD Geode processor (the same chip used in the first One Laptop Per Child $100 laptop), 256 megabytes of RAM, and a single four-gigabyte CompactFlash card. The largest FAWN cluster built to date, consisting of 21 nodes, draws a maximum of 85 watts under real-world conditions.

Each FAWN node performs 364 queries per second per watt, roughly a hundred times the efficiency of a traditional disk-based system on an input/output-intensive task, such as gathering all the disparate bits of information required to display a Facebook or FriendFeed page or a Google search result.
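As a rough sanity check, the article's own figures imply the whole-cluster throughput and the disk-based baseline. This is a back-of-the-envelope sketch using only numbers quoted above; the variable names are ours, not part of the FAWN project.

```python
# Figures quoted in the article (not measured here).
FAWN_QPS_PER_WATT = 364      # queries per second per watt, per FAWN node
CLUSTER_WATTS = 85           # maximum draw of the 21-node cluster
DISK_ADVANTAGE = 100         # FAWN's claimed efficiency edge over disk systems

# At 364 q/s/W, an 85-watt cluster handles roughly 31,000 queries per second.
cluster_qps = FAWN_QPS_PER_WATT * CLUSTER_WATTS

# The "hundred times better" claim puts a disk-based system near 3.6 q/s/W.
disk_qps_per_watt = FAWN_QPS_PER_WATT / DISK_ADVANTAGE

print(f"Whole-cluster throughput: ~{cluster_qps:,} queries/sec")
print(f"Implied disk-based baseline: ~{disk_qps_per_watt:.1f} queries/sec/watt")
```

The striking point is that the cluster-level number comes from multiplying efficiency by power, so holding efficiency constant while adding nodes scales throughput linearly with the power budget.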

This kind of performance may have applications beyond the data center, says Steven Swanson, an assistant professor in the department of computer science and engineering at the University of California, San Diego. Swanson’s own high-performance, flash-memory-based server, called Gordon, which currently exists only as a simulation, is similar to FAWN in its architecture but was designed with scientific applications as well as data centers in mind.

Swanson’s goal is to exploit the unique qualities of flash memory to handle problems that are currently impossible to address with anything other than the most powerful and expensive supercomputers on Earth: systems with up to a petabyte of RAM. “We work with the San Diego Supercomputing Center on large genomics and bioinformatics patterns,” says Swanson. “We want to do queries very quickly, and if the data graphs won’t fit in RAM, they get very slow, which means you have to give up fidelity in the simulation.”

FAWN is “the right direction to push,” says Niraj Tolia, a researcher in the Exascale Computing Lab at HP Labs. “The days are gone when we simply looked at raw performance as a metric,” he adds.

Currently, FAWN is not suitable for CPU-intensive tasks such as processing video, but Andersen says that future iterations will use the more powerful Atom processors (which Swanson is also contemplating for his Gordon system). Having been designed for netbooks, these more powerful processors draw about the same power as the AMD chips, roughly four watts each. Throw in a power supply and some networking equipment, and “you could very easily run a small website on one of these servers, and it would draw 10 watts,” says Andersen, a tenth of what a typical Web server draws.
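Andersen's tenfold figure translates directly into energy over time. The sketch below works out annual consumption from the two power levels mentioned above; the 100-watt figure for a typical Web server is inferred from his "a tenth" comparison, and the electricity price is an illustrative assumption of ours.

```python
# Power draws from the article; the typical-server figure is implied
# by Andersen's "a tenth of what a typical Web server draws".
FAWN_SERVER_WATTS = 10
TYPICAL_SERVER_WATTS = 100
HOURS_PER_YEAR = 24 * 365

# Hypothetical electricity price, for illustration only.
USD_PER_KWH = 0.10

fawn_kwh = FAWN_SERVER_WATTS * HOURS_PER_YEAR / 1000
typical_kwh = TYPICAL_SERVER_WATTS * HOURS_PER_YEAR / 1000
savings_usd = (typical_kwh - fawn_kwh) * USD_PER_KWH

print(f"FAWN-style server:  ~{fawn_kwh:.0f} kWh/year")
print(f"Typical Web server: ~{typical_kwh:.0f} kWh/year")
print(f"Savings at $0.10/kWh: ~${savings_usd:.0f}/year per server")
```

Multiplied across the thousands of machines in a data center, a per-server gap on this order is what makes operators like the ones Andersen names below a natural audience.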

The next generation of FAWN is something that Andersen hopes the largest users of data centers will investigate. “I would love it if we could get Facebook or Google or Microsoft to start building clusters with this,” he says.


Credit: David Anderson


