Moore’s Law: Beating The Noise Problem
We’re often told that Moore’s Law promises an exponential increase in the density of transistors on a chip, but we hear much less about the challenges this generates. One of these is the noise problem.

As transistors become smaller and their power requirements drop, noise becomes increasingly difficult to combat. The result is that chipmakers are being forced to accept a higher error rate in computations.
But in certain nonlinear systems, particularly biological ones, researchers have long known that instead of swamping signals, noise can play the opposite role, helping to enhance them. The phenomenon is known as stochastic resonance and it has been observed in systems such as neurons and even exploited to improve the perception of certain signals.
It’s relatively straightforward to demonstrate the phenomenon using a ring of identical oscillators driven by a harmonic signal. The harmonic signal generates a travelling wave around the ring but this quickly dissipates after the signal is switched off. Add noise to the system, however, and the travelling wave survives for much longer.
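The same effect shows up in the textbook model of stochastic resonance, which is even simpler than the ring described above: a single overdamped particle in a double-well potential, driven by a periodic signal too weak to push it over the barrier on its own. The sketch below (a minimal illustration, not the setup used in the paper; all parameter values are assumptions chosen for the demo) integrates this model and shows that with zero noise the particle never leaves its starting well, while moderate noise lets it hop between wells.

```python
import numpy as np

def simulate(D, A=0.3, omega=0.05, dt=0.01, steps=50_000, seed=0):
    """Euler-Maruyama integration of an overdamped particle in the
    double-well potential U(x) = -x**2/2 + x**4/4, driven by a weak
    periodic force of amplitude A, with noise intensity D."""
    rng = np.random.default_rng(seed)
    x = np.empty(steps)
    x[0] = 1.0  # start in the right-hand well
    for i in range(1, steps):
        drift = x[i - 1] - x[i - 1] ** 3 + A * np.cos(omega * i * dt)
        x[i] = x[i - 1] + drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
    return x

# A = 0.3 is below the static switching threshold (~0.385), so without
# noise the particle stays trapped in its well; with noise it hops.
quiet = simulate(D=0.0)
noisy = simulate(D=0.2)
```

Running this, `quiet` never crosses zero, while `noisy` visits both wells; at intermediate noise levels, the hopping tends to synchronize with the weak drive, which is the resonance.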
Could there be a way of exploiting stochastic resonance to make computer memory, asks a team from the Instituto Tecnologico de Buenos Aires in Argentina. Their idea is to build a resonator consisting of just two oscillators. They show that such a resonator can store a single bit of information in a noisy environment, even after the driving signal is switched off. They have even built a device that stores a single bit of data in this way.
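This sketch does not reproduce the team's device, whose storage relies on the interplay of driving and noise. It only illustrates the basic ingredient (an assumption for the sake of the example): a pair of coupled bistable elements whose common sign encodes a bit, and which holds that bit through moderate noise after any drive is removed.

```python
import numpy as np

def run_cell(bit, D=0.05, k=0.5, dt=0.01, steps=20_000, seed=1):
    """Two coupled overdamped bistable elements, integrated with
    Euler-Maruyama; the stored bit is the common sign of their states."""
    rng = np.random.default_rng(seed)
    s = 1.0 if bit else -1.0
    x = np.array([s, s])  # both elements set to the same well
    for _ in range(steps):
        coupling = k * (x[::-1] - x)  # each element pulled toward the other
        drift = x - x**3 + coupling
        x = x + drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal(2)
    return int(x.mean() > 0)
```

With these parameters the noise is well below the barrier height and the coupling makes a joint flip even less likely, so the bit survives the full 200 time units of undriven, noisy evolution.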
What isn’t clear, however, is exactly what kind of improvement would be possible on the nanoscale at which a real memory element would have to work. That’s obviously something for the future.
For now, these guys have a clever idea that could have important implications for data storage in the future.
Ref: arxiv.org/abs/0911.0878: One-Bit Stochastic Resonance Storage Device