
The promise of artificial neural networks is the stuff of sci-fi movies: computers that learn and work much like humans, through experience.

In the real world, however, neural networks haven't begun to live up to their potential. It appears that human nature is at least partially standing in the way: two key groups have failed to communicate fully with each other.

Researchers gathered to discuss real-world applications of neural networks at a session of the International Joint Conference on Neural Networks held in Washington, DC, in July.

But aside from research to spur the development of tiny, connected "smart" sensors for defense use, little of the discussion had much to do with what anyone outside academia would consider the real world.

Instead, researchers outlined the need to better connect neural network technology with current research about how the human brain works.

Missing a Core Connection

Jim Olds, director of George Mason University’s Krasnow Institute for Advanced Study in Fairfax, VA, remarked that the information about brain function that computer scientists have been relying on is about 30 years old.

“The field of artificial neural networks uses the word ‘neural’ in a very liberal fashion, with the idea that their software constructs are inspired by what we know about the nervous system,” Olds said. “But most of the knowledge that folks use to inspire them dates from neuroscience studies around 1965.”

This is because the two key groups in neural network development, the neuroscientists who conduct neurological research and the computer scientists who develop neural modeling and programming, aren't communicating enough. "Neuroscientists work at teaching hospitals or at places like NIH [National Institutes of Health]; people doing artificial neural networks might be at Carnegie Mellon. The natural overlap is zilch," Olds commented.

Needed: Neuroinformatics

What's the solution? Some say it's neuroinformatics: using information technology to better understand the functioning of the human brain and to help track and share research.

“There are almost 30,000 neuroscientists in the U.S. and 60,000 worldwide; there’s information overload,” said Stephen Koslow, director of the Office on Neuroinformatics at the National Institute of Mental Health in Bethesda, MD. “Informatics solves all our problems of access, sharing and collaborating.”

NIMH launched a neuroinformatics program in 1999 to pool information and model how the brain works. As part of its Human Brain Project, the program aims to set up a state-of-the-art information management system through cooperative efforts by neuroscientists and information scientists (including computer scientists, engineers, physicists and mathematicians).

With the strides the research community is making in understanding the brain, there is no lack of information that could help jump-start neural networking into a more productive engagement with neuroscience. And from that engagement, a far richer set of real-world applications should develop.

This may just be a simple question, proponents say, of using information technology better to create better information technology.
