MIT Technology Review

Unlike most neuroscience labs, Kwabena Boahen's lab at Stanford University is spotless: no scattered pipettes or jumbled arrays of chemical bottles. Instead, a lone circuit board, housing a very special chip, sits on a bare lab bench. The transistors in a typical computer chip are arranged for maximal processing speed, but this microprocessor features clusters of tiny transistors designed to mimic the electrical properties of neurons. The transistors are arranged to behave like cells in the retina, the cochlea, or even the hippocampus, a spot deep in the brain that sorts and stores information.

Boahen is part of a small but growing community of scientists and engineers using a process they call “neuromorphing” to build complicated electronic circuits meant to model the behavior of neural circuits. Their work takes advantage of anatomical diagrams of different parts of the brain generated through years of painstaking animal studies by neuroscientists around the world. The hope is that hardwired models of the brain will yield insights difficult to glean through existing experimental techniques. “Brains do things in technically and conceptually novel ways which we should be able to explore,” says ­Rodney ­Douglas, a professor at the Institute of Neuroinformatics, in Zurich. “They can solve rather effortlessly issues which we cannot yet resolve with the largest and most modern digital machines. One of the ways to explore this is to develop hardware that goes in the same direction.”

Among the most intriguing aspects of the brain is its capacity to form memories, something that has fascinated neuroscientists for decades. That capacity appears to be rooted in the hippocampus, damage to which can lead to amnesia.

Extensive studies of neurons in the hippocampus and other parts of the brain have shed some light on how neural behavior gives rise to memories. Neurons encode information in the form of electrical pulses that can be transmitted to other neurons. When two connected neurons repeatedly fire in close succession, the connection between them is strengthened, so that the firing of the first helps trigger the firing of the second. As this process, known to neuroscientists as Hebbian learning, occurs in multiple neighboring cells, it creates webs of connections between different neurons, encoding and linking information.
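The Hebbian rule described above can be sketched in a few lines of code. This is a minimal illustration, not the learning rule implemented on Boahen's chip; the update function, learning rate, and saturation behavior here are all assumptions chosen for clarity.

```python
def hebbian_update(weight, pre_fired, post_fired, learning_rate=0.1):
    """Strengthen a connection when the presynaptic and postsynaptic
    neurons fire together (Hebbian learning); otherwise leave it alone.
    The weight saturates as it approaches 1.0."""
    if pre_fired and post_fired:
        weight += learning_rate * (1.0 - weight)
    return weight

# Repeated paired firing strengthens an initially weak connection,
# so that eventually the first neuron helps trigger the second.
w = 0.05  # start with a weak connection
for _ in range(20):
    w = hebbian_update(w, pre_fired=True, post_fired=True)

print(w)  # much closer to 1.0 than the starting value of 0.05
```

With repeated co-activation the weight climbs toward its ceiling, while a connection whose neurons never fire together stays weak; that asymmetry is what lets webs of correlated cells encode and link information.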

To better understand how this works, Boahen and graduate student John Arthur developed a chip based on a layer of the hippocampus known as CA3. Sandwiched between two other cellular layers, one that receives input from the cortex and one that sends information back out again, CA3 is thought to be where memory actually happens: where information is stored and linked. Pointing to a diagram of the chip's architecture, Boahen explains that each model cell on the chip is made up of a cluster of transistors designed to mimic the electrical activity of a neuron. The silicon cells are arranged in a 32-by-32 array, and each of them is programmed to connect weakly to 21 neighboring cells. To start with, the connections between the cells are turned off, mimicking "silent synapses." (A synapse is a junction between neurons; a silent synapse is one where, if a given neural cell fires, it transmits a slight change in electrical activity to its neighbors, but not enough to trigger the propagation of an electrical signal.)
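The connectivity described above can be pictured with a toy software analogue. The real chip is analog silicon, and the article does not say which 21 neighbors each cell contacts, so the neighborhood rule below (the 21 nearest cells in a wrapped grid) is an assumption made purely for illustration.

```python
GRID = 32           # the chip's 32-by-32 array of model cells
NUM_NEIGHBORS = 21  # each cell connects weakly to 21 neighboring cells

def neighbors(row, col):
    """The 21 grid positions closest to (row, col), wrapping at the
    edges. (An illustrative choice; the chip's actual wiring pattern
    is not described in the article.)"""
    candidates = []
    for dr in range(-2, 3):
        for dc in range(-2, 3):
            if dr == 0 and dc == 0:
                continue  # a cell is not its own neighbor
            dist = dr * dr + dc * dc
            candidates.append((dist, (row + dr) % GRID, (col + dc) % GRID))
    candidates.sort()  # nearest first
    return [(r, c) for _, r, c in candidates[:NUM_NEighbors if False else NUM_NEIGHBORS]]

# Every connection starts "silent" (weight 0.0): a spike nudges the
# neighbor's electrical state, but not enough to make it fire.
weights = {
    ((row, col), nb): 0.0
    for row in range(GRID)
    for col in range(GRID)
    for nb in neighbors(row, col)
}

print(len(weights))  # 32 * 32 * 21 = 21504 weak connections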

4 comments. Share your thoughts »

Credit: Emily Nathan

Tagged: Biomedicine

Reprints and Permissions | Send feedback to the editor

From the Archives


Introducing MIT Technology Review Insider.

Already a Magazine subscriber?

You're automatically an Insider. It's easy to activate or upgrade your account.

Activate Your Account

Become an Insider

It's the new way to subscribe. Get even more of the tech news, research, and discoveries you crave.

Sign Up

Learn More

Find out why MIT Technology Review Insider is for you and explore your options.

Show Me