MIT News feature

A Less-Artificial Intelligence

Studying 70,000 mouse neurons could help Andreas Tolias build smarter AI.
February 21, 2018
Adrian Forrow

A fair number of engineers working on artificial intelligence don’t care whether their systems resemble real brains or not, as long as they perform well. But even today’s best systems can generalize only if fed thousands of samples, and they can’t transfer their generalizations to new contexts. This leaves AI vulnerable to attackers, who can trick it with tiny tweaks to the data. Neuroscientist Andreas Tolias believes that brain-like features could fix these problems.

In 2016, he founded Neuroscience-Inspired Networks for Artificial Intelligence (NINAI), a tag team of neuroscientists, physicists, mathematicians, and computer scientists that’s part of a larger effort to understand neural function (see “Inside the Moonshot Effort to Finally Figure Out the Brain,” November/December 2017). Their relay race toward better AI starts in Tolias’s lab at Baylor College of Medicine, which records all the neurons firing inside a one-millimeter cube of a mouse’s cortex. In December, they captured the activity of 70,000 neurons in one mouse, a feat that would have been impossible without the two- and three-photon imaging techniques Tolias’s lab helped advance.

The mice then go to the Allen Institute in Seattle, which slices and photographs their brains so a third team, at Princeton, can diagram which neurons are connected. By comparing this diagram with their recordings, Tolias’s lab deduces how the cells influence each other and what purpose each cell serves. If, as many neuroscientists suspect, the cortex is essentially built from a few common, repeated configurations of neurons, then explaining the activity in a one-millimeter cube could reveal the building blocks for all cognition.


Tolias has zeroed in on two key structural differences between brains and AI. First, a mouse’s brain has roughly a hundred types of neurons, while a typical AI network has only two or three varieties of artificial neurons. The brain’s extra cell types include interneurons, which can stop large groups of other neurons from firing. AI has no direct equivalent. Brains also have more types of connections between neurons than AI networks do. Most AI networks are “feed-forward,” meaning signals only go in one direction, from one layer of the network to the next. Unlike real brains, these networks don’t have recurrent connections (which allow feedback signals in opposite directions) or lateral connections (which link neurons within the same layer). The few types of AI networks with recurrent and lateral connections show promise, but the role of feedback in the cortex needs much more study. “The brain didn’t create all this recurrence for the fun of it,” says Tolias. He also suspects interneurons may be regulating the brain’s lateral connections to create the generalizing powers that AI lacks.
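The architectural contrast described above can be made concrete with a small sketch. In a feed-forward layer, a signal passes through once and is gone; recurrent and lateral connections let a layer's own output feed back into it, so activity unfolds over time. The code below is a minimal illustration in NumPy, not a model of anything in Tolias's work: the weight matrices, dimensions, and update rule are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions, chosen only for illustration
n_in, n_out = 4, 3
W = rng.normal(size=(n_out, n_in))   # feed-forward weights (layer to layer)
L = rng.normal(size=(n_out, n_out))  # lateral weights (within the same layer)
np.fill_diagonal(L, 0.0)             # no neuron connects to itself

def relu(z):
    return np.maximum(z, 0.0)

def feedforward_step(x):
    # Signals go one way only: input -> output, as in most AI networks
    return relu(W @ x)

def recurrent_lateral_step(x, h_prev):
    # The lateral term L @ h_prev lets neurons in the same layer
    # influence one another; h_prev is feedback from the previous step
    return relu(W @ x + 0.1 * L @ h_prev)

x = rng.normal(size=n_in)
h = np.zeros(n_out)
for _ in range(5):                   # iterate the recurrent dynamics
    h = recurrent_lateral_step(x, h)

print(feedforward_step(x).shape)     # (3,)
print(h.shape)                       # (3,)
```

The feed-forward version computes its answer in a single pass, while the recurrent version settles into its activity over several iterations; that iterative settling is the kind of dynamics a purely feed-forward network cannot express.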

Tolias hopes to use neuro-inspired components, including lateral connections, interneurons, and feedback, to build AI capable of one-shot learning, or generalizing from a single example. Success would be a big deal for AI, and for neuroscience, by identifying which features of neural circuits are needed for abstract thought. Tolias explains his quest in the words of Richard Feynman: “What I cannot create, I do not understand.”
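To see what one-shot learning asks of a system, consider the simplest conventional version of the idea: store a single labeled example per class and assign new inputs to whichever stored example is nearest. This sketch is a standard machine-learning illustration with made-up class names and feature vectors; it is not drawn from NINAI's work, where the goal is to get this kind of generalization from neuro-inspired circuitry rather than from a hand-built distance rule.

```python
import numpy as np

# One labeled example ("prototype") per class -- the single-shot training set
prototypes = {
    "circle": np.array([1.0, 0.0]),
    "square": np.array([0.0, 1.0]),
}

def classify(x):
    # Assign x to the class whose lone example is nearest in feature space
    return min(prototypes, key=lambda c: np.linalg.norm(x - prototypes[c]))

print(classify(np.array([0.9, 0.2])))  # circle
print(classify(np.array([0.1, 0.8])))  # square
```

The hard part, which this sketch hides, is learning a feature space in which one example per class is actually enough; that is the generalizing power Tolias suspects lateral connections and interneurons help the cortex achieve.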
