In the world of computers, silicon is king. The semiconducting element forms regular, near-perfect crystals into which chipmakers can carve the hundreds of millions of features that make up the microchips powering modern processors. Technological improvements let chipmakers cut the size of those features in half roughly every 18 months, a feat known as Moore’s law, after Intel cofounder Gordon Moore. Today, that size hovers around 180 nanometers (180 billionths of a meter), and researchers expect to push below 50 nanometers within a decade. But that’s about as far as silicon can go: below that threshold, quantum physics makes electrons too unruly to stay inside the lines. If computers are to keep up with Moore’s law, they will have to move beyond silicon. After a couple of decades of theorizing, computer scientists, bioengineers and chemists began lab experiments in the mid-1990s seeking alternative materials for future CPUs and memory chips. Today, their research falls into three broad categories: quantum, molecular and biological computing.
In the field of quantum computing, researchers seek to harness the very quantum effects that will be silicon’s undoing. Scientists have succeeded in making rudimentary logic gates out of molecules, atoms and subatomic particles such as electrons. Other teams, remarkably, have discovered ways to perform simple calculations using DNA strands or microorganisms that group and modify themselves.
Molecular Building Blocks
In one type of molecular computing (or nanocomputing), joint teams at Hewlett Packard Co. and UCLA sandwich complex organic molecules between metal electrodes coursing through a silicon substrate. The molecules orient themselves on the wires and act as switches. Another team at Rice and Yale universities has identified other molecules with similar properties.
Normally, the molecules won’t let electrons pass through to the electrodes, so a quantum property called tunneling, long used in electronics, is manipulated with an electric current to force the electrons through at the proper rate. If researchers can figure out how to lay down billions of these communicating molecules, they’ll be able to build programmable memory and CPU logic that is potentially millions of times more powerful than in today’s computers.
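To see why a grid of such switches amounts to memory, consider a toy model of a crossbar with a molecular switch at every wire intersection. This sketch is purely illustrative; the class and method names are invented here, not HP/UCLA’s actual design:

```python
class CrossbarMemory:
    """Toy model of a molecular crossbar memory: a switch sits at every
    intersection of a horizontal and a vertical nanowire, and toggling
    it between conducting (1) and blocking (0) stores one bit."""

    def __init__(self, rows, cols):
        self.switches = [[0] * cols for _ in range(rows)]

    def write(self, row, col, bit):
        # In the real device this would be a voltage pulse that
        # reorients the molecule at the chosen junction.
        self.switches[row][col] = bit

    def read(self, row, col):
        # Reading probes whether current passes through the junction.
        return self.switches[row][col]

mem = CrossbarMemory(4, 4)
mem.write(2, 3, 1)
print(mem.read(2, 3))   # 1
print(mem.read(0, 0))   # 0
```

The appeal of the architecture is that addressing a bit needs only one row wire and one column wire, so the wiring grows linearly while storage grows with the area of the grid.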
Molecular researchers like the HP/UCLA team, however, face a challenge in miniaturizing their current wiring technology (nanowires made from silicon strands) from several hundred nanometers down to approximately 10. Carbon nanotubes are promising substitutes: the rigid pipes make excellent conductors, but scientists must figure out how to wrangle them into the latticework needed for complex circuitry. “We’ve shown that the switching works,” says HP computer architect Philip Kuekes. “But there is still not as good an understanding of the basic mechanism so that an engineer can design with it.” Hewlett Packard and UCLA have jointly patented several techniques for manufacturing molecular computers, most recently in January 2002.
Although molecular circuits employ some quantum effects, a separate but related community of scientists is exploring the possibilities of quantum computing: computing with atoms and their component parts. It works from the notion that some aspect of a subatomic particle (say, the location of an electron’s orbit around a nucleus) can be used to represent the 1s and 0s of computers. As with molecules, these states can be manipulated and, in effect, programmed.
One approach, pursued by members of a national consortium involving Berkeley, Harvard, IBM, MIT and others, involves flipping the spin direction of particles to turn switches on or off. By applying electromagnetic radiation in a process called nuclear magnetic resonance (NMR), like that used in medical imaging, researchers can control the spin of the carbon and hydrogen nuclei in chloroform. Alternatively, filters and mirrors show promise for controlling photons as a switching mechanism. Other researchers work with materials such as quantum “dots” (electrons confined in silicon crystal) and “ion traps” (ionized atoms suspended in an electric field).
Quantum bits (qubits) have an unusual quality that makes them a double-edged sword for computing purposes, though. Due to the lack of determinism inherent in quantum mechanics, qubits can be both on and off simultaneously, a phenomenon called superposition. This makes it harder to force qubits into digital lockstep, but it also multiplies exponentially the amount of information groups of qubits can store. It theoretically allows massively parallel computation to solve problems previously thought intractable, such as factoring very large numbers. One implication: today’s encryption techniques depend on the infeasibility of finding the two prime factors of certain large numbers, so quantum computers may one day be able to crack most encrypted files that exist today. This possibility has given the research a boost from government agencies, including the National Security Agency.
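The exponential storage claim can be illustrated with a standard state-vector simulation (a sketch, not tied to any particular research group’s hardware): describing n qubits takes 2^n amplitudes, and a Hadamard-style operation on every qubit puts all of those basis states into equal superposition at once:

```python
import math

def hadamard_all(n):
    """Return the state vector after putting n qubits (initially |0...0>)
    into equal superposition with a Hadamard on each qubit."""
    dim = 2 ** n                        # 2^n basis states for n qubits
    amp = 1.0 / math.sqrt(dim)          # equal amplitude for each one
    return [amp] * dim

def measurement_probabilities(state):
    """Born rule: the probability of each classical outcome is |amplitude|^2."""
    return [abs(a) ** 2 for a in state]

state = hadamard_all(3)                  # 3 qubits -> 8 amplitudes
probs = measurement_probabilities(state)
print(len(state))                        # 8
print(round(sum(probs), 6))              # 1.0 -- probabilities sum to one
```

The catch, visible even in this toy, is that a classical simulator must track all 2^n amplitudes explicitly, which is exactly why quantum hardware could outrun it.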
To be manufacturable, quantum computers will require billions of such subatomic switches working together and interacting with their environments without falling into a disorganized state called decoherence. A quantum state called entanglement, in which many atoms are made to behave exactly alike, provides one possible solution. Researchers also hope to fight decoherence by harnessing a phenomenon called interference: the overlapping of quantum particles’ wavelike energy.
Getting Down to the Biology
In addition to molecular and quantum computing, a third approach, biological computing, relies on living mechanisms to perform logic operations.
Bioengineers have long understood how to manipulate genes to function as switches that activate other genes. Now they’re using the technique to build rudimentary computer “clocks” and logic gates inside bacteria such as E. coli. Other researchers use genes to prod microorganisms into states that represent information. A team headed by Thomas Knight at the MIT Artificial Intelligence Laboratory genetically manipulates luciferase, an enzyme in luminescent creatures such as fireflies, to generate light that serves as a medium of cell-to-cell communication.
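The idea of genes acting as switches composes into logic much as transistors do. The sketch below is a deliberate simplification with invented function names (real genetic circuits are analog and noisy): a gene carrying repressor binding sites expresses its protein only when no repressor is present, which makes it a NOR gate, and NOR is enough to build everything else:

```python
def gene_nor(repressor_a, repressor_b):
    """A gene with two repressor binding sites expresses its protein (1)
    only when neither repressor protein is present -- a NOR gate."""
    return int(not (repressor_a or repressor_b))

def gene_not(repressor):
    """A single-repressor gene acts as an inverter (NOT gate)."""
    return gene_nor(repressor, 0)

def gene_and(a, b):
    """NOR is universal: AND is a NOR of the inverted inputs."""
    return gene_nor(gene_not(a), gene_not(b))

print([gene_nor(a, b) for a in (0, 1) for b in (0, 1)])  # [1, 0, 0, 0]
print([gene_and(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]
```

In a cell, the “wire” connecting two such gates is the repressor protein itself, produced by one gene and diffusing to the binding site of the next.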
One of biological computing’s biggest challenges is calculating with elements that are flawed, unreliable and decentralized. To that end, Knight’s amorphous computing group studies ways to encourage bacteria to organize themselves into parallel-processing computers. “I don’t think of it as likely to be the path to making conventional computers,” Knight says. “It will be the way in which we build the molecular-scale computers.”
Molecular computers face similar reliability challenges. At HP, researchers used fault-tolerant algorithms to construct a silicon-based computer called Teramac that worked despite having 220,000 defects. Kuekes, Teramac’s project manager, says the company is now exploring ways to translate what they’ve learned to molecular computing.
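Teramac’s strategy of testing the hardware first and then configuring computations around the recorded defects can be sketched as follows. This is an illustration of the principle only, not HP’s actual tooling:

```python
def build_defect_map(units, is_defective):
    """Test every physical unit once and keep the ones that work,
    mimicking Teramac's post-manufacture test phase."""
    return [u for u in units if not is_defective(u)]

def map_logical_to_physical(n_logical, good_units):
    """Assign each logical resource to a known-good physical unit,
    routing around the defective ones."""
    if n_logical > len(good_units):
        raise ValueError("not enough working units")
    return dict(enumerate(good_units[:n_logical]))

physical = list(range(10))
defective = {2, 5, 7}                      # pretend these failed testing
good = build_defect_map(physical, lambda u: u in defective)
mapping = map_logical_to_physical(4, good)
print(mapping)                             # {0: 0, 1: 1, 2: 3, 3: 4}
```

The same discipline, overprovision, test, then route around flaws, is what would let a chemically assembled molecular computer tolerate imperfect self-assembly.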
Farther out on the biological curve is DNA computing, which attempts to exploit the way DNA strands recognize each other and combine into structures that could perform large, compute-intensive calculations in parallel.
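This parallel filtering idea traces back to Leonard Adleman’s 1994 experiment, which solved a small Hamiltonian-path problem with DNA. A conventional simulation of the same search (illustrative only; a real DNA computer performs the generation and filtering chemically, on vast numbers of strands at once) looks like this:

```python
from itertools import permutations

def hamiltonian_paths(n_nodes, edges, start, end):
    """Brute-force search mirroring what a DNA computer does chemically:
    generate huge numbers of candidate paths in parallel (here, all
    permutations of the interior vertices) and filter out invalid ones."""
    edge_set = set(edges)
    middle = [v for v in range(n_nodes) if v not in (start, end)]
    found = []
    for perm in permutations(middle):
        path = (start,) + perm + (end,)
        # Keep the path only if every consecutive pair is a real edge.
        if all((a, b) in edge_set for a, b in zip(path, path[1:])):
            found.append(path)
    return found

# A small directed graph; in Adleman's experiment each vertex and edge
# was encoded as a short DNA strand that ligated into candidate paths.
edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
print(hamiltonian_paths(4, edges, 0, 3))   # [(0, 1, 2, 3)]
```

On a conventional machine this loop grows factorially with the number of vertices; the hope for DNA computing is that the test tube explores all candidates simultaneously.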
Few in the biological community expect biocomputers to replace the general-purpose silicon computer. They hope instead to manufacture molecular computers cheaply and efficiently with organisms that can orient themselves into logic circuits or transform vats of chemicals to manufacture other chemicals.
Still more exciting possibilities come from the potential of special-purpose biological computers to interact with other biological systems. Miniature computers could be injected into living tissue to reprogram cancer-causing genes, for example, or administer insulin shots.
For now, all these applications loom distant on the horizon. But researchers agree that silicon’s days are numbered, and that radical new approaches will be needed to keep computers zooming through the 21st century.