Neural networks are taking the world of computing by storm. Researchers have used them to create machines that are learning a huge range of skills that had previously been the unique preserve of humans—object recognition, face recognition, natural language processing, machine translation. All these skills, and more, are now becoming routine for machines.
So there is great interest in creating more capable neural networks that can push the boundaries of artificial intelligence even further. One focus of this work is on creating circuits that operate more like neurons, so-called neuromorphic chips. But how can these circuits be made significantly faster?
Today, we get an answer of sorts thanks to the work of Alexander Tait and pals at Princeton University in New Jersey. These guys have built an integrated silicon photonic neuromorphic chip and show that it computes at ultrafast speeds.
Optical computing has long been the great dream of computer science. Photons have significantly more bandwidth than electrons and so can process more data more quickly. But the advantages of optical data processing systems have never outweighed the additional cost of making them, and so they have never been widely adopted.
That has started to change in some areas of computing, such as analog signal processing, which requires the kind of ultrafast data processing that only photonic chips can provide.
Now neural networks are opening up a new opportunity for photonics. “Photonic neural networks leveraging silicon photonic platforms could access new regimes of ultrafast information processing for radio, control, and scientific computing,” say Tait and co.
The key challenge is to produce an optical device in which each node has the same response characteristics as a neuron. The nodes take the form of tiny circular waveguides carved into a silicon substrate in which light can circulate. When released, this light modulates the output of a laser working at threshold, a regime in which small changes in the incoming light have a dramatic impact on the laser's output.
Crucially, each node in the system works with a specific wavelength of light—a technique known as wavelength division multiplexing. The light from all the nodes can be summed by total power detection before being fed into the laser. And the laser output is fed back into the nodes to create a feedback circuit with a nonlinear character.
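The summation step above can be sketched numerically. In this kind of "broadcast-and-weight" scheme, each node transmits on its own wavelength, and a photodetector measuring total optical power cannot distinguish the channels—so it inherently reports the weighted sum of all of them. The following is an illustrative sketch under those assumptions, not the paper's actual implementation; the node outputs and weights here are arbitrary made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 4

# Power carried on each node's wavelength channel (arbitrary units).
node_outputs = rng.uniform(0.0, 1.0, n_nodes)

# Hypothetical per-wavelength weights applied before detection.
weights = rng.uniform(-1.0, 1.0, n_nodes)

# Total power detection: the detector sums across wavelengths,
# yielding a single weighted-sum signal to drive the laser.
detector_signal = np.dot(weights, node_outputs)
```

The point of the design choice is that the physics of the detector performs the neural network's weighted sum "for free," with no explicit adder circuit.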
An important question is just how closely this nonlinearity mimics neural behavior. Tait and co measure the output and show that it is mathematically equivalent to a device known as a continuous-time recurrent neural network (CTRNN). "This result suggests that programming tools for CTRNNs could be applied to larger silicon photonic neural networks," they say.
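For reference, the standard CTRNN model the authors invoke describes each node's state with a differential equation of the form τ ds/dt = −s + W·σ(s) + I, where σ is a saturating nonlinearity. Here is a minimal sketch of that textbook form using simple Euler integration—generic, with hypothetical weights, not the paper's calibrated photonic parameters.

```python
import numpy as np

def ctrnn_step(s, W, I, tau=1.0, dt=0.01):
    """Advance the CTRNN state s by one Euler step of
    tau * ds/dt = -s + W @ sigma(s) + I."""
    sigma = 1.0 / (1.0 + np.exp(-s))   # sigmoid nonlinearity
    ds = (-s + W @ sigma + I) / tau
    return s + dt * ds

# Hypothetical two-node network with mutually coupled weights.
W = np.array([[0.0, -1.2],
              [1.2,  0.0]])
I = np.array([0.5, 0.5])               # constant input drive
s = np.zeros(2)

for _ in range(1000):
    s = ctrnn_step(s, W, I)
```

Because the photonic device obeys the same equations, tools built to program and analyze models like this one could, in principle, be reused directly on the hardware.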
That’s an important result because it means the device that Tait and co have made can immediately exploit the vast range of programming nous that has been gathered for these kinds of neural networks.
They go on to demonstrate how this can be done using a network consisting of 49 photonic nodes. They use this photonic neural network to solve the mathematical problem of emulating a certain kind of differential equation and compare it to an ordinary central processing unit.
The results show just how fast photonic neural nets can be. "The effective hardware acceleration factor of the photonic neural network is estimated to be 1,960× in this task," say Tait and co. That's a speed-up of three orders of magnitude.
That opens the doors to an entirely new industry that could bring optical computing into the mainstream. "Silicon photonic neural networks could represent first forays into a broader class of silicon photonic systems for scalable information processing," say Tait and co.
And others are working in this area too. Earlier this year, Yichen Shen at MIT and a few pals proposed the architecture behind a fully optical neural network and demonstrated elements of it using a programmable nanophotonic processor.
Of course much depends on how well the first generation of electronic neuromorphic chips perform. Photonic neural nets will have to offer significant advantages to be widely adopted and will therefore require much more detailed characterization. Clearly, there are interesting times ahead for photonics.
Ref: arxiv.org/abs/1611.02272: Neuromorphic Silicon Photonics
This story was updated on November 22 to include additional work done by researchers at MIT.
Why Meta’s latest large language model survived only three days online
Galactica was supposed to help scientists. Instead, it mindlessly spat out biased and incorrect nonsense.
DeepMind’s game-playing AI has beaten a 50-year-old record in computer science
The new version of AlphaZero discovered a faster way to do matrix multiplication, a core problem in computing that affects thousands of everyday computer tasks.
A bot that watched 70,000 hours of Minecraft could unlock AI’s next big thing
Online videos are a vast and untapped source of training data—and OpenAI says it has a new way to use it.
The White House just unveiled a new AI Bill of Rights
It's the first big step to hold AI to account.
Get the latest updates from
MIT Technology Review
Discover special offers, top stories, upcoming events, and more.