Bringing Light to Computers
Researchers at IBM recently announced a nanoscale silicon switch that can direct trillions of bits of data per second within an optical network. The switch could make it possible to incorporate the speed and bandwidth of a telecommunications network into a personal computer, say the researchers. This is an increasingly important goal for engineers as they look for the best design for future multicore machines: computers with more than one processing center.
The advance gives researchers more control over where bits are directed in an optical network smaller than a fingernail. “We’re talking about routing a terabit per second through a single switch,” says William Green, an IBM researcher who worked on the project. Such performance is comparable to what’s achieved by very large racks of mounted equipment for telecommunications fiber optics.
Today’s top-of-the-line computers come with two or four general processing cores, but within the next decade, engineers expect to build computers with tens of cores. One of the main problems with making a many-core machine is that it’s unclear how to let all the cores communicate efficiently with each other and with other components in the computer that lie off the chip, such as memory. Currently, all of this communication is conducted over metal wires that are etched into chips and circuit boards. But wires have intrinsic resistance, which limits how fast data can travel. In addition, the flowing electrons produce electrical interference and heat, both of which can cause computation errors.
Optical devices and waveguides built into the same silicon used to make chips are promising alternatives to electronic components and metal wires. Within the past few years, there’s been a flood of activity in this field, known as silicon photonics, from IBM, Intel, Sun Microsystems, Hewlett Packard, MIT, Columbia University, and the University of Southern California, to name a few. Researchers have steadily been creating ever more efficient silicon-based devices, such as lasers, modulators that encode data onto light, detectors, and filters that clean up signals as they travel through a network. In fact, Sun Microsystems was recently awarded a $44 million contract from the Pentagon to investigate approaches for replacing metal wires with beams of light.
While there are many pieces that are necessary for intracomputer optical networks, IBM’s switch announcement is an important step toward making such a system practical. “There have been a lot of advances in silicon photonics,” says Keren Bergman, a professor of electrical engineering at Columbia University, but IBM’s switch “is very important for being able to make optical networks on chips.” Because the device routes a number of different wavelengths of light to various parts of a chip or the system, engineers don’t need to build point-to-point waveguides to each destination in a system. “This enables you to generate and route photons to multiple destinations in a more efficient way,” Bergman says.
IBM’s switch, which is described in a recent paper in Nature Photonics, is made of connected, resonating rings etched into silicon. The rings are only 200 nanometers tall, far smaller than the optical fibers that normally carry light. When the switch is turned on, electrons are sent to a specific ring. These electrons change the way that the ring resonates, which effectively blocks light from passing through. The light bounces off the resonator and is reflected in another direction.
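The mechanism the researchers describe, injected carriers shifting a ring's resonance so that light is either passed or blocked, can be illustrated with a minimal numerical sketch. This is not IBM's device model: the Lorentzian notch is a standard textbook approximation for a resonator's through-port transmission, and every number here (wavelengths, linewidth, extinction, carrier-induced shift) is hypothetical, chosen only to show the on/off behavior.

```python
def transmission(wavelength_nm, resonance_nm, fwhm_nm=0.1, extinction=0.99):
    """Fraction of light passing the through port, modeled as a
    Lorentzian notch centered on the ring's resonant wavelength."""
    detune = (wavelength_nm - resonance_nm) / (fwhm_nm / 2)
    return 1.0 - extinction / (1.0 + detune ** 2)

SIGNAL_NM = 1550.00          # wavelength of the data-carrying light (hypothetical)
RESONANCE_UNBIASED = 1550.20 # ring resonance with no injected carriers (hypothetical)
CARRIER_SHIFT_NM = -0.20     # carrier injection shifts the resonance (hypothetical)

# Switch off: the signal sits away from the resonance and passes through.
switch_off = transmission(SIGNAL_NM, RESONANCE_UNBIASED)

# Switch on: injected electrons shift the resonance onto the signal,
# which effectively blocks it from continuing straight through.
switch_on = transmission(SIGNAL_NM, RESONANCE_UNBIASED + CARRIER_SHIFT_NM)

print(f"switch off: {switch_off:.3f} of the light passes")
print(f"switch on:  {switch_on:.3f} of the light passes")
```

In a real device the blocked light isn't lost; as the article notes, it bounces off the resonator and is redirected down another path, which is what makes this a switch rather than just a filter.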
The design is unique for a number of reasons, explains Green. First, the switch does not filter the light based on its wavelength, unlike switches used in telecommunications networks, which need to route specific wavelengths to specific destinations. And the more wavelengths of light that are let through an on-chip network, the more bandwidth is available.
A second distinguishing characteristic, Green notes, is that IBM’s switch is able to withstand a variation of about 30 °C, which is crucial to ensuring that the network is reliable. Within any given microprocessor, says Green, hot spots move around on the surface of the chip as a function of number crunching. If these optical interconnects are distributed all over the surface, he says, engineers need to make sure that the hot spots don’t change the properties of the devices, so that data can make it to each end of the chip unaltered. The temperature resilience of the switch, Green says, is due, in part, to allowing multiple wavelengths of light through. As the switch changes temperature, it also changes properties, which causes some wavelengths of light to be blocked. But since the switch was designed to route a broad spectrum, it can still function in an environment with a variable temperature.
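Green's temperature argument can be made concrete with a rough sketch. Silicon devices have a thermo-optic effect that shifts their spectral response as they heat up (on the order of 0.1 nm per °C for silicon resonators, so a 30 °C swing is roughly a few nanometers). The numbers and channel plan below are hypothetical; the point is only that a shift which wipes out a single-wavelength filter barely dents a switch designed to route a broad band.

```python
# 13 hypothetical wavelength channels, spaced 0.8 nm apart.
CHANNELS_NM = [1545.2 + 0.8 * i for i in range(13)]

def channels_routed(band_lo_nm, band_hi_nm, thermal_shift_nm):
    """Channels still inside the device's operating band after chip
    heating shifts that band by thermal_shift_nm."""
    lo = band_lo_nm + thermal_shift_nm
    hi = band_hi_nm + thermal_shift_nm
    return [c for c in CHANNELS_NM if lo <= c <= hi]

SHIFT_NM = 3.0  # order-of-magnitude shift for a ~30 °C hot spot (hypothetical)

# A narrowband, wavelength-selective filter loses its one channel...
narrow_cool = channels_routed(1549.9, 1550.1, 0.0)
narrow_hot = channels_routed(1549.9, 1550.1, SHIFT_NM)

# ...while a broadband switch keeps routing every channel.
broad_hot = channels_routed(1540.0, 1560.0, SHIFT_NM)

print(f"narrowband filter: {len(narrow_cool)} channel cool, {len(narrow_hot)} hot")
print(f"broadband switch:  {len(broad_hot)} of {len(CHANNELS_NM)} channels hot")
```

As the article says, some individual wavelengths near the band edges can still be lost as the device heats, but because the switch routes a broad spectrum, the network keeps functioning across the temperature swing.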
Green says that it could be five to ten years before this switch finds its way into a commercial machine. IBM has already made ultrasmall optical silicon modulators, but, he says, it will take years to integrate the modulator, the switch, and other components with chip electronics.
Indeed, the promise of silicon photonics produces a new challenge: how to redesign a computer to communicate with light instead of with electrons. “How do you design an interconnected network that really exploits the optics?” asks Bergman. “You can’t follow the network design rules of electronics,” she says. “There are many things that are going to evolve dramatically as we go forward.”