
Intel’s Breakthrough

Its new silicon laser could add decades to Moore’s Law.

Moore’s Law, which celebrated its 40th anniversary this spring, has been the semiconductor industry’s greatest blessing. In 1965, Intel cofounder Gordon Moore projected that the number of transistors on a computer chip would double every two years. At the time, a chip held just a few dozen transistors. Today, Intel’s high-end chip contains more than 1.7 billion transistors, and that number is expected to exceed 10 billion by 2012. This steady four-decade march has fueled the modern computer revolution and made Intel into a tech powerhouse.

But the ability to pack more and more transistors and other circuitry onto chips is exacerbating a host of problems that could, if they become severe enough, threaten the growth of the existing silicon-based digital economy. Just a few of the trouble areas: heat buildup, electrical currents leaking out of circuits, electrical crosstalk between neighboring wires. The latest CPUs for desktop computers, for example, consume 100 watts of power. Laptop CPUs are generally more efficient, since they’re intended to maximize battery life. But even they now consume as much as 75 watts. “It’s like putting a toaster on your lap,” says Pat Gelsinger, an Intel senior vice president. One fix that is expected to become widespread is to boost the number of transistors on a chip not by making them smaller, but simply by plunking down the same circuit pattern two or more times on the same slab of silicon. Intel released its first such “dual core” chips this spring. And Intel executives envision a future of “many core” chips, with up to a thousand processors side by side.

But there’s a rub. The copper wires that convey the stream of digital 1s and 0s into and out of a computer, and between processors in some computers, can carry only so much data so quickly. “If I double the performance [of a processor], I need to double the performance on and off the chip,” Gelsinger says. “Copper, our traditional interconnect technology, is running out of speed.”

The problem is that electrical pulses traveling through a copper wire encounter electrical resistance, which degrades the information they carry. As a result, data bits traveling through copper must be spaced far enough apart and move slowly enough that devices on the other end of the wire can pick them up. This limitation is already producing data traffic jams on local-area networks that use copper wires to connect computers. And many experts predict it will create bottlenecks for data traffic between multiple processors within individual computers. The upshot is that even if Moore’s Law continues to hold, computers will no longer be able to take advantage of the increased power it delivers, since they won’t be able to move data onto and off chips quickly enough to keep up with the processors. It’s a fundamental challenge: computers need to find a faster way to move a vast amount of data both within and between chips.
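To see roughly why length is so punishing for copper, consider a back-of-the-envelope model. The sketch below is illustrative only: the resistance and capacitance figures are assumptions chosen for round numbers, not measurements of any real wire. It treats a copper trace as a distributed RC line, whose signal delay grows with the square of its length, and assumes one bit per delay interval.

```python
# Rough ceiling on the bit rate of a copper trace, modeled as a
# distributed RC line. All numbers are illustrative assumptions.

R_PER_M = 100.0    # resistance per meter of trace (ohms/m), assumed
C_PER_M = 100e-12  # capacitance per meter of trace (farads/m), assumed

def rc_limited_bitrate(length_m: float) -> float:
    """Crude bit-rate ceiling: one bit per RC delay interval.

    Uses the standard delay estimate for a distributed RC line,
    delay ~ 0.38 * R * C * L^2 (R and C per unit length).
    """
    delay_s = 0.38 * R_PER_M * C_PER_M * length_m ** 2
    return 1.0 / delay_s

for length_m in (0.01, 0.1, 1.0):  # chip scale up to cable scale
    rate_gbps = rc_limited_bitrate(length_m) / 1e9
    print(f"{length_m:5.2f} m of copper -> ~{rate_gbps:10.2f} Gbit/s ceiling")
```

The quadratic length penalty is the point: under these assumed numbers, a wire ten times longer is a hundred times slower, which is why copper pinches hardest between chips and between machines, exactly where the bottlenecks are appearing.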

Enter the silicon laser. Optical connections can carry thousands of times more data per second than copper wires can. But existing optical components, which are made out of such exotic semiconductors as gallium arsenide and indium phosphide, are far too expensive for use in individual computers or even local networks. If you could make optical devices out of silicon, which is cheap and, at least for a company like Intel, easy to manufacture, that would change everything. The move to silicon optics would add a basic new capability to silicon chips: the ability to manipulate and respond to light. Companies would likely exploit that capability first by replacing copper connections with optical links in networks. But eventually, silicon photonics might also replace copper wires between processors within a single chip. Chip designers also envisioned using silicon optics in the internal clocks that microprocessors use to execute instructions, dramatically increasing clock speeds, and thus computing speeds.

Until recently, all that speculation about the potential of silicon optics was hypothetical: suitable silicon lasers didn’t exist. But things changed last winter when the lab of Intel scientist Mario Paniccia reported the first continuous all-silicon laser. Built using the same manufacturing methods that produce silicon chips, the experimental device turned out a steady stream of infrared photons, an achievement that many researchers had believed impossible in silicon.

It’s still early days for silicon photonics. But the Intel result, which built on findings reported over the past year in a flurry of papers describing advances in silicon-based optical components, is convincing many experts that it could become practical to closely link optical and electronic technology at the computer level. The progress made by Paniccia’s team has been remarkable, says Graham Reed, a silicon-photonics pioneer at the University of Surrey in England. “Now all of the skeptics are starting to believe that silicon will have a real impact on optics.”

Anticipated advances in silicon technology will almost certainly keep Moore’s Law going for the foreseeable future, creating ever faster computers. By speeding immense amounts of data into and out of chips and between machines, silicon photonics could help people access this vast computational power.

Lousy Emitter
Optical fibers constitute the backbones of long-distance telecommunications networks and are largely responsible for the speed of the Internet. But optical components don’t come cheap. Optically sending and receiving data requires a laser that creates a light beam; a “modulator” that chops that beam into on/off bursts that represent digital 1s and 0s; “waveguides” that pipe the light through chips; and photodetectors that capture the light and convert it back into an electronic signal. Currently, these devices are not made out of silicon and cost thousands of dollars to put into place. Telecom providers can afford those prices, but making the technology feasible for moving data within a computer means reducing prices by orders of magnitude.
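The division of labor among those four components can be sketched as a toy pipeline. Everything in the snippet below is invented for illustration (the function names, the loss figure, the detection threshold); a real link handles analog optical power, not Python lists. But it traces the round trip the paragraph describes: electrical bits chop a steady beam, the light travels, and a detector turns it back into bits.

```python
# Toy model of an optical link: laser -> modulator -> waveguide ->
# photodetector. All names and numbers are illustrative assumptions.

def laser(n: int) -> list[float]:
    """Continuous-wave laser: a steady stream of light at unit power."""
    return [1.0] * n

def modulator(beam: list[float], bits: list[int]) -> list[float]:
    """Chop the steady beam into on/off bursts that represent 1s and 0s."""
    return [power * bit for power, bit in zip(beam, bits)]

def waveguide(pulses: list[float], loss: float = 0.9) -> list[float]:
    """Pipe the light through the chip, shedding a little power en route."""
    return [p * loss for p in pulses]

def photodetector(pulses: list[float], threshold: float = 0.5) -> list[int]:
    """Convert received light back into an electronic bit stream."""
    return [1 if p > threshold else 0 for p in pulses]

bits = [1, 0, 1, 1, 0, 0, 1]
received = photodetector(waveguide(modulator(laser(len(bits)), bits)))
assert received == bits  # the data survives the optical round trip
```

A telecom-grade version of each stage currently costs thousands of dollars; the question is whether every stage can be built from one cheap material.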

Silicon may be the answer. “Silicon to us, it’s maybe not a religious experience, but it’s pretty close,” Gelsinger says. “Silicon has proven cost effective, scalable, durable, manufacturable and has all sorts of other wonderful characteristics.” Photonic parts made of silicon would make optics more affordable and broaden potential uses. “Today, optics is a niche technology. Tomorrow it’s the mainstream of every chip that we build,” Gelsinger says.

Until about a year ago, it looked as if silicon would never play a significant role in optics. “Silicon is not intrinsically the best optical material,” explains Reed. Among its most obvious deficits is that it’s a lousy light-emitter. When the electrons in silicon are excited, instead of releasing photons they cause the silicon’s crystal lattice to vibrate. The result is heat, not light. By contrast, semiconductors such as gallium arsenide and indium phosphide emit light when electrically excited. So while researchers have been fascinated by the prospects of an “optical chip” for years, the consensus was that silicon was not the right material to build it with.
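The reason for that behavior is a textbook point worth making explicit (it is standard semiconductor physics, not a claim from the article): silicon's bandgap is indirect, so an excited electron and the hole it left behind sit at different crystal momenta. A photon carries essentially no momentum, so emitting one requires a lattice vibration, a phonon, to balance the books:

```latex
% Energy and momentum bookkeeping for light emission in indirect-gap silicon:
\[
  E_{\mathrm{gap}} = \hbar\omega_{\text{photon}} \pm \hbar\Omega_{\text{phonon}},
  \qquad
  \mathbf{k}_{e} - \mathbf{k}_{h} = \pm\,\mathbf{q}_{\text{phonon}}
  \quad (\mathbf{k}_{\text{photon}} \approx 0).
\]
```

A three-party event involving an electron, a photon, and a phonon is far rarer than the two-party alternative, so excited electrons in silicon overwhelmingly dump their energy straight into lattice vibrations: heat, not light.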

Then, in the late 1990s, researchers reported a series of encouraging, albeit preliminary, advances in silicon optics (see “Upstream,” Technology Review, June 2001). At Intel, the progress made by Paniccia’s team convinced executives to ramp up the company’s silicon-photonics program. Intel’s first breakthrough came in February 2004, when Paniccia reported in the journal Nature that his group had made a silicon modulator capable of converting a steady stream of light from a laser into rapid pulses of digital 1s and 0s at a rate of one billion hertz, or one gigahertz, a 50-fold advance over the previous experimentally demonstrated record for silicon. “But it still wasn’t anywhere near fast enough,” Reed says. Then this spring, Intel researchers led by materials scientist Ling Liao reported a silicon modulator that runs at 10 gigahertz, roughly on par with other optical modulators.

But the crucial silicon-photonic component was still the laser. Last September, four separate groups, including Paniccia’s, reported silicon lasers that fire staccato pulses of light. Because silicon does a poor job of converting electrical charges into light, all these silicon lasers relied on external lasers as energy sources. Like all chip-based lasers, the silicon lasers work by converting energy (in this case, photons from another light source) into a burst of photons that all share essentially the same wavelength and phase. The Intel researchers exploited a long-known principle called the Raman effect, in which photons give up a fixed quantum of energy to vibrating atoms and re-emerge at a slightly longer wavelength.
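That energy exchange fixes the laser's output color. As a hedged worked example (silicon's Raman shift of about 15.6 terahertz is a well-established material constant, but the 1,550-nanometer pump wavelength is an assumption based on standard telecom practice, not a figure from the article):

```latex
% Stokes shift: the emitted photon comes out one fixed vibrational quantum
% below the pump frequency.
\[
  \nu_{\text{Stokes}} = \nu_{\text{pump}} - \Delta\nu_{R}
  \approx \frac{c}{1550\,\mathrm{nm}} - 15.6\,\mathrm{THz}
  \approx 193.4\,\mathrm{THz} - 15.6\,\mathrm{THz}
  = 177.8\,\mathrm{THz},
\]
\[
  \lambda_{\text{Stokes}} = \frac{c}{177.8\,\mathrm{THz}} \approx 1686\,\mathrm{nm}.
\]
```

So a silicon Raman laser is less a light generator than a precise wavelength converter: it needs pump photons to start with, which is why all four groups leaned on external lasers.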

Pulsed lasers aren’t great for transmitting data, though. Optics engineers much prefer continuous lasers, which they can slice and dice with modulators to create data signals. But all of the groups struggled with the same problem. As they increased the amount of continuous laser light they fed into the silicon chips, the likelihood that pairs of incoming photons would strike a single silicon atom at the same time also increased. When that happened, the combined energy of the two photons kicked electrons out of their atomic orbits, and those mobile charges voraciously gobbled up photons. The incoming laser had to be pulsed to give the electrons the millionths of a second they needed to give up their excess energy and relax back to their resting states.
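The competition between gain and carrier absorption can be written compactly. This is a minimal sketch using notation that is conventional in the silicon-photonics literature, not equations taken from the article:

```latex
% Growth of the laser (Stokes) intensity I_s along the waveguide, fed by
% Raman gain from the pump intensity I_p and eaten by linear loss alpha
% and free-carrier absorption sigma*N:
\[
  \frac{dI_s}{dz} = \bigl(g_R\,I_p - \alpha - \sigma N\bigr)\,I_s,
  \qquad
  N \approx \frac{\beta\,I_p^{2}}{2h\nu_p}\,\tau,
\]
% where beta is the two-photon-absorption coefficient and tau is the
% free-carrier lifetime.
```

Because the carrier density grows as the square of the pump intensity while the Raman gain grows only linearly, turning up a continuous pump eventually drives the net gain negative. The only remaining knob is the carrier lifetime, which is exactly the one Intel turned.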

Paniccia’s team came up with an answer that was both brilliant and, for those familiar with silicon technology, conceptually simple. Etched into the Intel laser chip was a silicon waveguide channel in which light bounced back and forth, gaining in intensity. The researchers implanted electrodes on both sides of the channel. When they turned on a voltage between the electrodes, it created an electric field that herded the negatively charged electrons toward the positively charged electrode, effectively sweeping them out of the way. As a result, the photons were able to build up unhampered, until they produced a continuous laser beam.
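A small numerical sketch shows why shortening the carrier lifetime rescues the laser. Every coefficient below is an order-of-magnitude assumption for illustration, not a measured Intel value; the model is the steady-state balance from the equations above.

```python
# Net optical gain (per meter) of a silicon Raman laser versus the
# free-carrier lifetime. All coefficients are order-of-magnitude
# assumptions for illustration, not Intel's measured values.

H_NU = 1.28e-19    # pump photon energy (J), roughly 1550 nm light
G_R = 1.0e-10      # Raman gain coefficient (m/W), assumed
BETA = 5.0e-12     # two-photon-absorption coefficient (m/W), assumed
SIGMA = 1.45e-21   # free-carrier absorption cross-section (m^2), assumed
ALPHA = 23.0       # linear waveguide loss (1/m), about 1 dB/cm, assumed
I_PUMP = 5.0e11    # continuous pump intensity (W/m^2), assumed

def net_gain(lifetime_s: float) -> float:
    """Net gain in 1/m: Raman gain minus linear and free-carrier loss."""
    generation = BETA * I_PUMP**2 / (2 * H_NU)  # carriers made per m^3 per s
    n_carriers = generation * lifetime_s        # steady-state density (1/m^3)
    return G_R * I_PUMP - ALPHA - SIGMA * n_carriers

for tau, label in ((1e-6, "no voltage, ~1 us carrier lifetime"),
                   (1e-9, "electrodes on, ~1 ns carrier lifetime")):
    g = net_gain(tau)
    verdict = "net gain, lasing possible" if g > 0 else "net loss, no laser"
    print(f"{label}: {g:+9.1f} /m -> {verdict}")
```

The pump fixes the gain term, so the roughly thousandfold drop in lifetime, from electrons lingering for microseconds to being swept out in about a nanosecond (the exact figures here are assumptions), is what flips the sign and lets the beam build.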

Last winter, three days before Christmas, Paniccia’s colleagues Haisheng Rong and Richard Jones saw the first sign that the strategy was working: a line on the display of an optical-spectrum analyzer showing that the infrared photons produced by the laser were coming out in a steady stream.

On the Inside
Intel researchers still have to find ways to manufacture silicon lasers alongside electronic components on chips. Electronic circuits are built through the painstaking process of laying down and etching dozens of layers of materials. Some of these steps require temperatures well over 1,000 °C or exposure to caustic chemicals. So Intel’s engineers will need to ensure that the steps required to build up the optical devices don’t degrade the electronic circuitry, and vice versa.

As an initial demonstration of the usefulness of silicon photonics, Paniccia plans later this year to integrate several modulators and other optical components onto a piece of silicon; this setup should enable data transfer speeds of 100 gigabits per second. Such a prototype, hopes Paniccia, will illustrate the potential of silicon photonics to carry data into and out of chips far more efficiently than anything currently on the market.
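The arithmetic behind that figure is simple if, as seems plausible, each modulator runs at the 10-gigahertz rate Liao's group demonstrated and drives its own wavelength of light down a shared path; the channel count here is an inference for illustration, not a number from Intel:

```latex
% Aggregate rate from combining several wavelength channels (assumed layout):
\[
  10\ \text{modulators} \times 10\ \mathrm{Gbit/s}\ \text{per wavelength}
  \;=\; 100\ \mathrm{Gbit/s}\ \text{aggregate}.
\]
```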

Walking through one of his newly renovated labs this spring, Paniccia showed off a mock-up of an optical Ethernet cable that would use silicon photonics. While Paniccia normally maintains the modest, careful demeanor of a scientist, it’s clear he relishes using the prop to sell his vision of silicon’s new role. On the end of the spaghetti-strand-thin cable sits a connector that resembles the end of a phone cord, with metal pads sitting under tiny slits in a silicon encasement. In a functional version of the cable, electrical signals would travel from a computer chip through those metal pads to a silicon photonic chip inside the tiny connector, where they would be converted into a stream of light pulses.

While on the outside the cable resembles familiar technology, adding cheap silicon photonics to it would bring unprecedented speed and power to computers. And it would allow Intel to attach its famous “Intel Inside” branding to yet another transformative technology. Realizing that vision won’t be easy. Still, Paniccia is convinced it will happen. “There is no question anymore whether we can do this. It’s when and how. That’s been the change in the last year.” And when the last technical barrier falls, he says, “silicon photonics will be everywhere.”

Robert Service is a Portland, OR-based writer who covers chemistry and materials science for Science.
