MIT Technology Review



To create a computer as powerful as the human brain, perhaps we first need to build one that works more like a brain. Today, at the International Joint Conference on Neural Networks in Dallas, IBM researchers will unveil a radically new computer architecture designed to bring that goal within reach. Using simulations of enormous complexity, they show that the architecture, named TrueNorth, could lead to a new generation of machines that function more like biological brains.

The announcement builds on IBM’s ongoing projects in cognitive computing. In 2011, the research team released computer chips that use a network of “neurosynaptic cores” to manage information in a way that resembles the functioning of neurons in a brain (see “IBM’s New Chips Compute More Like We Do”). With TrueNorth, the researchers demonstrate a way to use those chips for specific tasks, and they show that the approach could be used to build, among other things, a more efficient biologically inspired visual sensor.

“It doesn’t make sense to take a programming language from the previous era and try to adapt it to a new architecture. It’s like a square peg in a round hole,” said Dharmendra S. Modha, lead researcher. “You have to rethink the very notion of what programming means.”

In a series of three papers released today, Modha’s team details the TrueNorth system and its possible applications.

Most modern computer systems are built on the von Neumann architecture—with separate units for storing information and processing it sequentially—and they use programming languages designed specifically for that architecture. TrueNorth instead stores and processes information in a distributed, parallel way, like the neurons and synapses in a brain.

Modha’s team has also developed software that runs on a conventional supercomputer but simulates a massive network of two billion neurosynaptic cores linked by 100 trillion virtual synapses.

Each core of the simulated neurosynaptic computer contains its own network of 256 “neurons,” which operate using a new mathematical model. In this model, the digital neurons mimic the independent nature of biological neurons, developing different response times and firing patterns in response to input from neighboring neurons.
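The independent firing behavior described here can be sketched with a leaky integrate-and-fire neuron—a standard abstraction in neuromorphic computing, used below purely for illustration; IBM's actual neuron model is defined in the papers and differs in its details:

```python
class SpikingNeuron:
    """Illustrative leaky integrate-and-fire neuron (not IBM's model).

    The neuron accumulates input, slowly leaks charge, and emits a
    spike when its membrane potential crosses a threshold.
    """

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential required to fire
        self.leak = leak            # fraction of potential retained per tick
        self.potential = 0.0

    def step(self, input_current):
        """Advance one discrete time step; return True if the neuron fires."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after a spike
            return True
        return False

# Constant input produces a regular firing pattern; varying the input
# or the leak changes the response time, as the model above describes.
neuron = SpikingNeuron()
spikes = [neuron.step(0.4) for _ in range(10)]
```

With a steady input of 0.4, this neuron charges over three ticks, fires, resets, and repeats—one simple example of how response timing emerges from the interplay of input, leak, and threshold.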

“Programs” are written using special blueprints called corelets. Each corelet specifies the basic functioning of a network of neurosynaptic cores. Individual corelets can be linked into more and more complex structures—nested, Modha says, “like Russian dolls.”
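The Russian-doll nesting of corelets might look something like the sketch below. The names `Corelet` and `compose` are hypothetical—the corelet language itself is not shown in the article—but the sketch captures the stated idea: each unit specifies a self-contained behavior, and units link into larger composites:

```python
class Corelet:
    """Hypothetical stand-in for a corelet: a named, self-contained
    blueprint mapping input spike data to output spike data."""

    def __init__(self, name, process):
        self.name = name
        self.process = process  # function from input spikes to output spikes

    def run(self, spikes):
        return self.process(spikes)


def compose(*corelets):
    """Nest corelets so each one's output feeds the next,
    yielding a new, larger corelet (Russian-doll style)."""
    def pipeline(spikes):
        for corelet in corelets:
            spikes = corelet.run(spikes)
        return spikes
    return Corelet("+".join(c.name for c in corelets), pipeline)


# Two toy corelets, loosely echoing the article's examples:
motion = Corelet("motion", lambda s: [x for x in s if x > 0])
threshold = Corelet("threshold", lambda s: [1 if x > 2 else 0 for x in s])

# Linking them yields a new corelet that can itself be nested further.
detector = compose(motion, threshold)
```

The composite `detector` behaves exactly like a primitive corelet, so it can be fed into `compose` again—the property that lets simple blueprints build up into arbitrarily complex structures.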

TrueNorth comes with a library of 150 pre-designed corelets, each for a particular task. One corelet can detect motion, for example, while another can sort images by color. Also included with TrueNorth is a curriculum to help academics and, eventually, customers learn to use the system.

Karlheinz Meier, co-director of the European Union’s Human Brain Project, says that untraditional computing architectures like TrueNorth aren’t meant as a replacement for existing devices but as gateways into entirely new markets for technology. They might, for example, be used to solve some big-data problems that the traditional von Neumann approach cannot untangle.

“If you look at which architecture can already [solve these problems] today, it’s the brain,” says Meier. “We learn from data. We do not have predetermined algorithms. We are able to make predictions and causal relationships even in situations we have never seen before.”

For example, the researchers hope to use TrueNorth to develop systems as powerful as human vision. The brain sorts through more than a terabyte of visual data each day yet requires little power to do so. IBM and iniLabs, a partner company in Zurich, plan to apply TrueNorth to the development of such a visual sensor.

The team envisions the technology one day making its way into everyday machines like smartphones and automobiles. The researchers plan to continue refining the software, which is derived from a basic model of how the brain functions and is not held up by unresolved questions about how the brain really works.

“At this point, we are not wanting for more insights from neuroscience today. We are not limited by it,” says Modha. “We are extending the boundaries of what computers can do efficiently.”


Credit: IBM Research

Tagged: Computing, Materials, IBM, neuroscience

