
A new computer chip mimics the neurocircuitry of our noses to smell

It draws inspiration from the structure and electrical activity of the brain to distinguish between odors.
March 16, 2020
Nabil Imam with Intel's Loihi chip. Intel

Of all the things our brain can do, the way it helps us smell is one of the best understood. When an odor hits the olfactory cells in our nose, they send a signal to the corresponding cluster of neurons in the brain known as the olfactory bulb. The bulb then ferries the signal out to other parts of the brain, allowing us to appreciate the perfume of a grapefruit or avoid the stench of trash.

Olfactory bulbs are specific to mammals, but other animals, such as insects, have analogous neural structures. That means “there’s probably something fairly fundamental and efficient about these implementations if evolution has arrived on them in different cases,” says Mike Davies, the director of Intel’s Neuromorphic Computing Lab.

Because they are so efficient and so well understood, olfactory systems are a great starting point for neuromorphic chips, a new type of computing hardware that takes inspiration directly from the structure of the brain.

On Monday, scientists at Intel published a paper in Nature that proposes a new neuromorphic chip design that mimics the structure and capabilities of the olfactory bulb. The researchers worked with olfactory neurophysiologists who study the brains of animals as they smell. Based on the neural circuits that activate when those animals process an odor, they designed an electrical circuit that could be etched onto a silicon chip. They also designed an algorithm that mirrors the behavior of the electrical signals pulsing through the circuit. When they trained the algorithm on the chip using an existing data set of 10 “smells”—each characterized by measurements from 72 chemical sensors—it accurately distinguished between them with far fewer training samples than a conventional chip would need.
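To make the task concrete, here is a minimal sketch of what few-shot odor classification over 72-sensor readings looks like. This is not Intel's spiking algorithm—it is a hypothetical nearest-centroid classifier on fabricated sensor signatures, included only to illustrate the shape of the problem: 10 odor classes, each described by a 72-dimensional vector of chemosensor responses, learned from very few samples.

```python
# Hypothetical sketch of few-shot odor classification (NOT Intel's method).
# Assumptions: 10 odor classes, 72 chemical sensors, fabricated signatures.
import numpy as np

rng = np.random.default_rng(0)

n_odors, n_sensors = 10, 72
# Fabricated "ground truth" sensor signature for each odor (assumption).
signatures = rng.normal(size=(n_odors, n_sensors))

def train(samples_per_odor=1, noise=0.1):
    """Learn one centroid per odor from a handful of noisy readings."""
    centroids = np.zeros((n_odors, n_sensors))
    for k in range(n_odors):
        samples = signatures[k] + noise * rng.normal(size=(samples_per_odor, n_sensors))
        centroids[k] = samples.mean(axis=0)
    return centroids

def classify(reading, centroids):
    """Assign a 72-sensor reading to the nearest learned centroid."""
    return int(np.argmin(np.linalg.norm(centroids - reading, axis=1)))

centroids = train(samples_per_odor=1)          # a single training sample per odor
test_reading = signatures[3] + 0.1 * rng.normal(size=n_sensors)
print(classify(test_reading, centroids))       # recovers odor index 3
```

A deep-learning classifier would typically need many samples per class to reach the same accuracy on noisy readings; the promise of the neuromorphic approach described in the article is this kind of sample efficiency, achieved with spiking circuitry rather than centroid arithmetic.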

The chip is still a relatively early-stage prototype, but once mature it could serve a number of applications, such as bomb sniffing or the detection of noxious fumes in chemical plants. It also demonstrates the potential of neuromorphic computing for more data-efficient AI.

Currently the most popular chips for running state-of-the-art deep-learning algorithms all follow a von Neumann architecture, a design convention that has powered the computing revolution for decades. But these architectures are inefficient learners: the algorithms that run on them require massive amounts of training data, in contrast to our far more efficient brains. Neuromorphic chips, therefore, try to preserve the brain’s structure as much as possible, on the premise that close mimicry will increase the chip’s learning efficiency. Indeed, Intel successfully got the chip to learn from very few samples.

Moving forward, the research team plans to improve the design of its neuromorphic chip and apply it to other functions of the brain beyond smell. Davies says the team will likely turn its attention to vision or touch next but has longer-term ambitions to tackle more complex processes. “Our sensing mechanisms are the natural place to start because these are well understood,” he says. “But in a sense we’re working our way in and into the brain, up to the higher-order thought processes that happen.”

