
Illuminating Silicon

Optical devices made out of silicon could transform computing.
August 15, 2007

“We’re going to be communicating with terabits of information in the next decade,” says Mario Paniccia, an Intel fellow and director of the Photonics Technology Lab in Santa Clara, CA. A terabit of data is the capacity of roughly 35 DVDs. But today’s fastest telecommunications networks use chips that zip data around at 10 to 40 gigabits per second, and most networks use expensive, clunky components that are assembled piecemeal and achieve lower speeds. “The ability to have an integrated chip that can transmit and receive a terabit is a compelling solution, and we’re still talking a chip the size of your fingernail,” says Paniccia, holding in his palm three silicon chips that could prove to be the heart of that solution–thumbnail-size squares that reflect light like mirrors.
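For a sense of scale, here is the back-of-the-envelope arithmetic behind those figures; the size of a typical DVD-length movie (roughly 3.5 GB) is an assumption, since the article doesn’t state one.

```python
# Rough arithmetic behind the figures above (the DVD movie size is an assumed value).
TERABIT = 1e12                  # bits in one terabit
DVD_MOVIE_BITS = 3.5 * 8e9      # assume a typical DVD-length movie occupies ~3.5 GB

print(TERABIT / DVD_MOVIE_BITS)         # ~36 movies per terabit ("roughly 35 DVDs")
print(TERABIT / 10e9, TERABIT / 40e9)   # 100.0 and 25.0 seconds to move one terabit at today's 10-40 Gb/s
```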

Mario Paniccia, an Intel fellow and director of the Photonics Technology Lab in Santa Clara, CA, holds a test fixture with a modulator mounted at its center, a die holding numerous light detectors, and a gold-colored, fingernail-size square of hybrid lasers, built on a silicon substrate.

Photonic technology, which uses light to transmit data, is the key to networks with terabit-per-second speeds. But silicon, a mainstay of the electronics industry, has been largely useless for photonics because of its poor optical properties. Photonics researchers have had to rely on exotic semiconductors such as indium phosphide, which emit light easily but are expensive and hard to work with. But in 2004, Paniccia’s group showed that silicon could be used to make a modulator that encodes data onto a light beam at one gigabit per second. (Telecom companies are beginning to use non-silicon-based modulators that operate at 40 gigabits per second.) Then, in 2005, the Intel researchers bumped up the speed to 10 gigabits per second and built a surprisingly good all-silicon laser (see “Intel’s Breakthrough,” July 2005).

Intel’s goal is to build a single silicon chip that integrates a laser, modulator, and detector, so it can emit light, encode it with data, and register incoming signals. Such a chip, says Paniccia, will affect several areas of technology. It could boost Internet bandwidth, because telecom networks would have access to more and cheaper integrated chips. It could enable new types of optical cables that transfer full-length movies from computers to iPhones or other mobile Internet devices in seconds. And computers themselves would speed up if the sluggish copper wiring that shuttles data between circuits on a microchip, and between the chip and the computer’s memory, were replaced with beams of light.

In building these new optical chips, Intel plans to piggyback on existing silicon fabrication technology such as the lithographic systems used to pattern tiny transistors onto chips. Paniccia says that the ability to build photonic devices on large silicon wafers, using fine-tuned lithography to carve out features, could someday make photonic devices nearly as cheap and abundant as transistors. And if Intel has its way, integrated photonic chips that use silicon-based components will be on the market within the next five years.


“The Intel group has essentially been debunking the myth that silicon isn’t good for photonics,” says Alan Willner, a professor of electrical engineering at the University of Southern California.

The Current Work
Researchers in Paniccia’s lab are spending a lot of time tweaking the designs of three key silicon-based devices. One, the silicon hybrid laser, was first demonstrated in September 2006. While the all-silicon laser announced in 2005 emits light at near-infrared wavelengths useful for medical applications, the hybrid laser operates in the infrared range used in telecommunications networks. It is this laser that Paniccia calls the “game changer” for telecom and consumer electronics applications.

To make their silicon laser produce light at the right wavelengths, the researchers needed to use a small amount of indium phosphide. The trick was to develop a glue that easily bonded the two materials together. At present, Paniccia’s team is trying out slight variations on the design to improve performance. For instance, to reduce power consumption, the researchers are changing the position of the metal contacts that supply electricity to the laser.

The second device the group is working on is the modulator, which enables light to carry data. When laser light enters a conventional modulator, the modulator rapidly turns it on and off, encoding the 1s and 0s of binary data onto the beam. Modulators are usually made of expensive materials, such as lithium niobate, that easily alter light passing through them if a voltage is applied. Since silicon doesn’t readily alter light, Paniccia had to turn to a different design, which takes advantage of the material’s ability to guide light through channels. His modulator uses an interferometer, a device that creates interference between waves of light. Light enters one end of the modulator and is split into two beams. An electrical device alters each beam’s phase–basically, knocking the two light waves out of sync. Then the beams, with their slightly altered phases, recombine. The result is a beam that flickers on and off, representing digital information.
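The encoding step can be captured in a few lines. The following is a minimal, normalized toy model of an interferometric modulator of the kind described above, not Intel’s actual device: the beam splits into two arms, an electrically controlled phase shift is applied to one, and the recombined intensity comes out bright or dark depending on whether the arms interfere constructively or destructively.

```python
import numpy as np

def modulator_output(phase_shift: float) -> float:
    """Relative output intensity for a given phase difference between the two arms."""
    arm_a = 0.5 * np.exp(1j * 0.0)          # reference arm (normalized field amplitude)
    arm_b = 0.5 * np.exp(1j * phase_shift)  # arm with the electrically tuned phase
    return abs(arm_a + arm_b) ** 2          # recombined intensity, between 0 and 1

def encode_bits(bits: list[int]) -> list[float]:
    """Map each bit to a phase shift (1 -> 0 rad, 0 -> pi rad) and return the light levels."""
    return [modulator_output(0.0 if b else np.pi) for b in bits]

print(encode_bits([1, 0, 1, 1, 0]))  # ~[1.0, 0.0, 1.0, 1.0, 0.0]
```

In this picture, driving the phase between 0 and π forty billion times a second is what a 40-gigabit-per-second modulator amounts to.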

This past July, Paniccia announced that his group had made a silicon modulator that can operate at a record-breaking 40 gigabits per second–as fast as the best modulators currently used in the telecom industry. There’s still work to be done on the design, to optimize the device’s performance. But Paniccia thinks mass production is viable.

The last part of the silicon-photonics puzzle is a working detector that can receive light from a laser and modulator. Again, Paniccia is attempting to overcome a basic limitation of silicon: it doesn’t absorb light very efficiently. He and his team have been experimentally adding atoms of germanium to silicon to change its photonic properties so that it can absorb light at telecom wavelengths. They’ve built detectors that operate at 20 gigabits per second, but that figure is constantly improving as the researchers vary the way the germanium is added and tinker with the design of the electrical contacts. Paniccia expects to have a 40-gigabit-per-second detector operating by the fall.
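A rough illustration of why germanium helps, using textbook bandgap values rather than figures from the article: a semiconductor can only absorb photons whose energy exceeds its bandgap, and the cutoff wavelength follows from λ = hc/E.

```python
# Sketch of the bandgap arithmetic behind germanium-enhanced silicon detectors.
# Bandgap values are standard textbook figures, not taken from the article.
HC_EV_NM = 1239.84  # Planck's constant times the speed of light, in eV*nm

def cutoff_wavelength_nm(bandgap_ev: float) -> float:
    """Longest wavelength a material with this bandgap can absorb."""
    return HC_EV_NM / bandgap_ev

print(cutoff_wavelength_nm(1.12))  # silicon: ~1107 nm -> misses the 1310/1550 nm telecom bands
print(cutoff_wavelength_nm(0.66))  # germanium: ~1879 nm -> covers telecom wavelengths
```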

Paniccia refers to the next stage of development as the “valley of death,” because unforeseen problems can crop up as a technology moves from the lab to the market. But he and his coworkers are optimistic. Paniccia points to the guts of a computer that is using a combination of lasers, modulators, and detectors made of traditional optical materials–each device is about the size of a deck of cards and can cost hundreds of dollars–to transfer data around the motherboard. He hopes to replace those devices with photonic chips mass-produced on the same scale as the microprocessors.

If silicon photonic chips are built into computers, says Paniccia, a lot will need to change, including fundamental functions such as the way the computer boots up and the way the microprocessor accesses memory. “No one’s looking at these problems yet, because there hasn’t been a reason to,” he says. But now that the various elements of silicon photonics are becoming a reality, that might be about to change. “Silicon photonics is making us rethink a lot of things,” he says.
