The achievement may not rank up there with Samuel Morse’s transmitting “What hath God wrought” from Washington, DC, to Baltimore in 1844 or Alexander Graham Bell’s voice intoning, “Watson, come here. I want you” from one room to another in 1876. Nevertheless, scientists may eventually mark as a milestone the day in 2001 when Isaac Chuang and his colleagues at IBM determined that the two prime factors of the number 15 are three and five.

What made their calculation remarkable, of course, wasn’t the grammar school arithmetic, but that the calculation had been performed by seven atomic nuclei in a custom-designed fluorocarbon molecule. The irony that an experiment so complex and delicate would yield a result so pedestrian and mundane is not lost on Chuang, one of the world’s most prominent researchers in quantum computing. “My group,” he says with a chuckle, “holds the world record for the largest and most useless quantum computer.”
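For readers who want to see the arithmetic behind the headline: the IBM experiment ran Shor's quantum factoring algorithm, which reduces factoring to finding the period of repeated modular multiplication. The quantum hardware finds that period; ordinary classical arithmetic then extracts the factors. A minimal Python sketch of the classical half, with the period brute-forced here as a stand-in for the quantum step, looks like this:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Return the smallest r > 0 with a**r % n == 1.
    This is the step a quantum computer does efficiently;
    it is brute-forced here purely for illustration."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factors_from_period(n: int, a: int):
    """Recover factors of n from the period of a mod n
    (assumes a is coprime to n and the period is even)."""
    r = find_period(a, n)
    y = pow(a, r // 2, n)          # a^(r/2) mod n
    return gcd(y - 1, n), gcd(y + 1, n)

# Base a = 7, one of the cases run in the IBM experiment: the
# period of 7 mod 15 is 4, so gcd(7**2 - 1, 15) = 3 and
# gcd(7**2 + 1, 15) = 5.
print(factors_from_period(15, 7))  # (3, 5)
```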

But Chuang, now an associate professor at the MIT Media Lab, might be showing an excess of humility. Quantum computers exist today only on a painfully small scale. Yet despite a slow start, the field seems to be on the verge of yielding real advances in quantum theory and engineering. Researchers have proposed the first designs for large-scale quantum computers: devices that would exploit the bizarre quantum behavior of atoms and subatomic particles to solve problems that confound even the most powerful conventional machines.

One engineering approach with considerable promise employs a class of devices that can trap individual electrons within an electromagnetic field. Their “spin,” or orientation in a magnetic field, can be observed to produce a quantum bit, or “qubit.” Another promising approach uses nuclear magnetic resonance, which can manipulate collections, or “ensembles,” of molecules to perform computations and return results in a measurable form. This is the technique Chuang and Neil Gershenfeld, a fellow MIT Media Lab professor, are exploring. “People are coming up with all these tools that will make a quantum computer easier to make,” says Jonathan Dowling, principal scientist and supervisor of the quantum-computing technologies group at NASA’s Jet Propulsion Laboratory in Pasadena, CA.
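To make the notion of a qubit concrete: where a classical bit is either 0 or 1, a qubit holds complex "amplitudes" for both values at once and yields a definite answer only when measured. The following toy state-vector simulation in Python, illustrative only and tied to neither hardware approach above, shows a qubit put into an equal superposition and then measured:

```python
import numpy as np

# A qubit is a 2-component complex vector: amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate turns a definite state into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0              # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = np.abs(state) ** 2    # measurement probabilities

# Measurement collapses the superposition: 0 or 1, each about half the time.
rng = np.random.default_rng()
outcomes = rng.choice([0, 1], size=10, p=probs)
print(probs)     # [0.5 0.5]
print(outcomes)  # e.g. [0 1 1 0 1 0 0 1 1 0]
```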

This research is also starting to indicate near-term spinoffs, including improvements in electronic controls for navigational, communications, and measurement devices. “One of the things I do is think of ways to use quantum computing to make better gizmos,” says Dowling. Among his projects is a quantum gyroscope that would exploit the quantum behavior of photons to make these crucial navigation devices more sensitive. The excitement has spread even to the artificial-intelligence community: there are signs that the ability of quantum algorithms to probe multiple possibilities simultaneously might help in the mining of large-scale databases, one of the field’s most important practical goals. If the global positioning systems, mobile phones, search engines, and integrated circuits of the future are vastly more precise or reliable than today’s, it may be the result of trailblazing quantum-computing efforts under way right now in labs around the world.
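The database-mining hope rests largely on Grover's search algorithm, which finds a marked item among N unsorted entries in roughly the square root of N steps rather than N. The short Python simulation below, a classical sketch of the algorithm's mechanics that of course gains no real speedup without quantum hardware, shows how repeated amplitude amplification concentrates probability on the sought-after item:

```python
import numpy as np

def grover_search(n_items: int, marked: int) -> np.ndarray:
    """Simulate Grover's algorithm on a vector of n_items amplitudes.
    On real hardware n_items would be a power of two; the simulation
    works for any size."""
    # Start in an equal superposition: every item "probed" at once.
    state = np.full(n_items, 1 / np.sqrt(n_items))
    iterations = int(np.floor(np.pi / 4 * np.sqrt(n_items)))
    for _ in range(iterations):
        state[marked] *= -1               # oracle: flip the marked amplitude
        state = 2 * state.mean() - state  # inversion about the mean
    return np.abs(state) ** 2             # measurement probabilities

probs = grover_search(16, marked=3)
print(probs.argmax(), round(probs[3], 3))  # 3, ~0.96 after only 3 steps
```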

The electronics industry, naturally, has taken notice. IBM sponsors quantum-computing research at its Almaden Research Center on the outskirts of Silicon Valley, where Chuang performed his initial work, as well as at its flagship Thomas J. Watson Research Center in Yorktown Heights, NY. Hewlett-Packard supports quantum-computing research at its labs in Palo Alto, CA, and Bristol, England. And the semiconductor industry, which has come of age along with advances in the electronics of classical computing, is keeping a sharp eye on developments in the field, the better to be prepared for the post-Moore’s Law era, when miniaturization of classical electronic circuits bumps against physical limits. Experts predict that will happen sometime within the next two decades.

“Progress has been slow, but it’s been steady,” says David P. DiVincenzo of IBM’s Watson lab. “Two or three years ago some significant efforts were started that are beginning to pay off.”
