Computing at the Speed of Light
Replacing metal wiring with fiber optics could change everything from supercomputers to laptops.
The world of computing could change rapidly in coming years thanks to technology that replaces the metal wiring between components with faster, more efficient fiber-optic links.
“All communications over long distance are driven by lasers, but you’ve never had it inside devices,” says Mario Paniccia, director of Intel’s photonics lab in Santa Clara, CA. “Our new integrated optical link makes that possible.”
Paniccia’s team has perfected tiny silicon chips capable of encoding and decoding laser signals sent via fiber optics. Today, when data arrives at a computer via a fiber-optic connection, it has to be handed off from a separate photonic device to an electronic circuit. The new system promises to speed things up because everything works in silicon.
Last week, Paniccia’s team demonstrated the first complete photonic communications system made from components fully integrated into silicon chips. Electronic data piped into one chip is converted into laser light that travels down an optical fiber and is converted back into electrical signals a few fractions of a second later. The system can carry data at a rate of 50 gigabits per second, enough to transfer a full-length HD movie in less than a second.
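A back-of-the-envelope check of that claim, assuming a full-length HD movie occupies roughly 5 gigabytes (the article does not give a file size; that figure is an assumption):

```python
# Rough check of the quoted transfer time for a 50 Gb/s link.
# The ~5 GB movie size is an assumption, not a figure from the article.
movie_size_bytes = 5 * 10**9           # assumed HD movie size: ~5 GB
link_rate_bps = 50 * 10**9             # 50 gigabits per second

movie_size_bits = movie_size_bytes * 8
transfer_seconds = movie_size_bits / link_rate_bps

print(f"{transfer_seconds:.2f} s")     # prints "0.80 s"
```

At that rate the transfer finishes in about 0.8 seconds, consistent with the "less than a second" claim.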
The silicon photonic chips could replace the electronic connections between a computer’s key components, such as its processors and memory. Copper wiring used today can carry data signals at little more than 10 gigabits per second. That means critical components like the central processing unit and the memory in a server cannot be too far apart, which restricts how computers can be built.
The new Intel setup has four lasers built into its transmitter chip that shine data into a single optical fiber at slightly different wavelengths, or “colors.” Chips with even more lasers should make it possible to communicate at 1,000 gigabits per second.
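The scaling path implied by those figures can be sketched with simple arithmetic, assuming each laser carries an equal share of the total rate (the article does not state per-channel speeds):

```python
# Wavelength-division multiplexing arithmetic implied by the article.
# Assumes the 50 Gb/s total is split evenly across the four lasers;
# per-channel rates are not given in the article.
total_rate_gbps = 50
num_lasers = 4
per_channel_gbps = total_rate_gbps / num_lasers    # 12.5 Gb/s per wavelength

# Wavelengths needed to reach the projected 1,000 Gb/s at that same
# per-channel rate:
target_gbps = 1000
channels_needed = target_gbps / per_channel_gbps   # 80 wavelengths

print(per_channel_gbps, channels_needed)           # prints "12.5 80.0"
```

In practice the channel count could be lower if each laser's data rate also rises, but the sketch shows why adding wavelengths is the lever Paniccia points to.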
“Having a chip the size of your fingernail that can deliver a terabit per second changes the way you can think about design,” says Paniccia. Such chips could make a big difference inside the sprawling data centers operated at great expense by Web giants like Google, Microsoft, and Facebook. “Data centers today are big piles of copper; that imposes the limits on how you arrange components inside a server,” Paniccia says.
“If I could just move the memory a foot away [from the processors], I could add a whole board of memory for a single CPU instead,” says Paniccia, whose team is experimenting with prototype servers to work out how to build them with photonics links inside.
Moving a server’s memory away from the CPUs would also make cooling them easier. Since roughly half the cost of running a data center, whether it stores Facebook profiles or banking records, comes from cooling, that could have a significant impact.
Further savings may come from the fact that optical links require less power to operate, says Keren Bergman, who leads a silicon photonics research group at Columbia University. “With electrical wires, the longer you go, the more energy you spend in an exponential fashion,” she says. Optical fiber allows low-power signals to travel farther, faster. Bergman’s group has used data on the performance of computers at Lawrence Berkeley and MIT’s Lincoln Laboratories to simulate how systems with optical interconnects might perform. “You can get an order of magnitude gain in energy efficiency,” she says, with the largest gains seen for applications such as high-bandwidth image processing and video streaming.
Data centers aren’t the only things that may see their insides lit up with lasers. “We’ve developed this technology to be low-cost so we can take it everywhere, not just into high-performance computing or the data center,” says Paniccia. The components of the Intel system, including the lasers, are made with the same silicon-sculpting methods used to construct computer chips in vast quantities. “I’m drafting Moore’s law,” says Paniccia. “We’ve enabled the benefits of using light with the low-cost, high-volume scalability of silicon.” In consumer computers like laptops, that would allow innovations in industrial design, he says. “I could put the memory in the display instead, and change the design of the whole thing.”
This could make it easier to swap in new components without having to open up a machine. It would also allow core components to be installed in peripherals. Extra memory could, for example, be hidden in a laptop or smart phone dock to increase a portable device’s computing power when plugged in.
Fully exploiting the benefits of the optical age will, however, mean changes to the components being linked up. “It’s not just a case of whip out the electrical wires and replace them with optical fiber,” says Bergman.
Ajay Joshi, an assistant professor at Boston University, who is also exploring design options for high-performance computers with optical interconnects, agrees. “If we speed up the channel between logic [processors] and memory, we need to rethink the way you design that memory.”
The speed mismatch between processors and optical links is a smaller problem, but ultimately that too will likely change. “It would be nice to also see processors that work optically instead of electronically,” Joshi says.